[ 569.933555] env[68856]: Modules with known eventlet monkey patching issues were imported prior to eventlet monkey patching: urllib3. This warning can usually be ignored if the caller is only importing and not executing nova code.
[ 570.579578] env[68906]: Modules with known eventlet monkey patching issues were imported prior to eventlet monkey patching: urllib3. This warning can usually be ignored if the caller is only importing and not executing nova code.
[ 571.914396] env[68906]: DEBUG os_vif [-] Loaded VIF plugin class '' with name 'linux_bridge' {{(pid=68906) initialize /opt/stack/data/venv/lib/python3.10/site-packages/os_vif/__init__.py:44}}
[ 571.914754] env[68906]: DEBUG os_vif [-] Loaded VIF plugin class '' with name 'noop' {{(pid=68906) initialize /opt/stack/data/venv/lib/python3.10/site-packages/os_vif/__init__.py:44}}
[ 571.914810] env[68906]: DEBUG os_vif [-] Loaded VIF plugin class '' with name 'ovs' {{(pid=68906) initialize /opt/stack/data/venv/lib/python3.10/site-packages/os_vif/__init__.py:44}}
[ 571.915120] env[68906]: INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
[ 572.119900] env[68906]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm {{(pid=68906) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}}
[ 572.130526] env[68906]: DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 0 in 0.011s {{(pid=68906) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}}
[ 572.239724] env[68906]: INFO nova.virt.driver [None req-ff0f8a98-9c23-4294-b9bd-56cfcf584ec2 None None] Loading compute driver 'vmwareapi.VMwareVCDriver'
[ 572.317841] env[68906]: DEBUG oslo_concurrency.lockutils [-] Acquiring lock "oslo_vmware_api_lock" by "oslo_vmware.api.VMwareAPISession._create_session" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 572.318017] env[68906]: DEBUG oslo_concurrency.lockutils [-] Lock "oslo_vmware_api_lock" acquired by "oslo_vmware.api.VMwareAPISession._create_session" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 572.318127] env[68906]: DEBUG oslo_vmware.service [-] Creating suds client with soap_url='https://vc1.osci.c.eu-de-1.cloud.sap:443/sdk' and wsdl_url='https://vc1.osci.c.eu-de-1.cloud.sap:443/sdk/vimService.wsdl' {{(pid=68906) __init__ /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:242}}
[ 575.218836] env[68906]: DEBUG oslo_vmware.service [-] Invoking ServiceInstance.RetrieveServiceContent with opID=oslo.vmware-79fb2140-fe8c-4103-811f-0fd0b28e3d92 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 575.235264] env[68906]: DEBUG oslo_vmware.api [-] Logging into host: vc1.osci.c.eu-de-1.cloud.sap. {{(pid=68906) _create_session /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:242}}
[ 575.235407] env[68906]: DEBUG oslo_vmware.service [-] Invoking SessionManager.Login with opID=oslo.vmware-dee012da-6d5b-4489-9cec-0c64144b8c50 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 575.278633] env[68906]: INFO oslo_vmware.api [-] Successfully established new session; session ID is e64b1.
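The records above show oslo.vmware bootstrapping its SOAP session: _create_session() takes the oslo_vmware_api_lock, builds the suds client from the host's /sdk endpoint, and invokes SessionManager.Login. A minimal sketch of the equivalent standalone call, assuming stock oslo.vmware; the credentials and tuning values below are placeholders, not taken from this log:

    from oslo_vmware import api

    # Placeholder credentials; only the host name appears in the log above.
    session = api.VMwareAPISession(
        'vc1.osci.c.eu-de-1.cloud.sap',  # host behind the soap_url/wsdl_url above
        'administrator@vsphere.local',   # placeholder username
        'secret',                        # placeholder password
        10,                              # api_retry_count
        0.5,                             # task_poll_interval, in seconds
    )
    # Each SOAP call is logged as "Invoking <object>.<method> with opID=oslo.vmware-..."
    about = session.vim.service_content.about
    print(about.version)  # e.g. "7.0.3", matching the vCenter version record below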
[ 575.278762] env[68906]: DEBUG oslo_concurrency.lockutils [-] Lock "oslo_vmware_api_lock" "released" by "oslo_vmware.api.VMwareAPISession._create_session" :: held 2.961s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 575.279354] env[68906]: INFO nova.virt.vmwareapi.driver [None req-ff0f8a98-9c23-4294-b9bd-56cfcf584ec2 None None] VMware vCenter version: 7.0.3
[ 575.282720] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2292f96d-c21f-4ce2-b48b-bd97129a9c1e {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 575.299866] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-93e85f5c-793e-4bb8-afb5-68d78fc15886 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 575.309980] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-71940bd2-a102-4d2a-8801-e7a08db1b829 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 575.316511] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b7266efe-efbc-4ce5-a199-2475585339fa {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 575.329499] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d5dac84d-b0c5-4956-adb2-f7f08cdaefe5 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 575.335328] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e0e2d639-98f8-44d0-a232-705b7170439d {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 575.365700] env[68906]: DEBUG oslo_vmware.service [-] Invoking ExtensionManager.FindExtension with opID=oslo.vmware-709dba69-41cd-4f0a-b3ec-b65e77c4bcd1 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 575.370574] env[68906]: DEBUG nova.virt.vmwareapi.driver [None req-ff0f8a98-9c23-4294-b9bd-56cfcf584ec2 None None] Extension org.openstack.compute already exists. {{(pid=68906) _register_openstack_extension /opt/stack/nova/nova/virt/vmwareapi/driver.py:224}}
[ 575.373196] env[68906]: INFO nova.compute.provider_config [None req-ff0f8a98-9c23-4294-b9bd-56cfcf584ec2 None None] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
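The Acquiring / acquired / "released" triplets throughout this log come from oslo.concurrency's lockutils wrapper (the inner() function named in each record). A minimal sketch of the pattern, assuming plain oslo.concurrency; the decorated function below is illustrative, not the actual nova code:

    from oslo_concurrency import lockutils

    @lockutils.synchronized('oslo_vmware_api_lock')
    def create_session():
        # While this body runs, lockutils' inner() emits DEBUG records:
        #   Acquiring lock "oslo_vmware_api_lock" by "create_session"
        #   Lock "oslo_vmware_api_lock" acquired by "create_session" :: waited 0.000s
        # and on return:
        #   Lock "oslo_vmware_api_lock" "released" by "create_session" :: held ...s
        pass

    create_session()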
[ 575.391328] env[68906]: DEBUG nova.context [None req-ff0f8a98-9c23-4294-b9bd-56cfcf584ec2 None None] Found 2 cells: 00000000-0000-0000-0000-000000000000(cell0),b8262c3e-187a-4d14-9c42-8f02010df9c1(cell1) {{(pid=68906) load_cells /opt/stack/nova/nova/context.py:464}}
[ 575.393221] env[68906]: DEBUG oslo_concurrency.lockutils [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 575.393450] env[68906]: DEBUG oslo_concurrency.lockutils [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 575.394159] env[68906]: DEBUG oslo_concurrency.lockutils [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.001s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 575.394569] env[68906]: DEBUG oslo_concurrency.lockutils [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] Acquiring lock "b8262c3e-187a-4d14-9c42-8f02010df9c1" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 575.394763] env[68906]: DEBUG oslo_concurrency.lockutils [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] Lock "b8262c3e-187a-4d14-9c42-8f02010df9c1" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 575.395710] env[68906]: DEBUG oslo_concurrency.lockutils [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] Lock "b8262c3e-187a-4d14-9c42-8f02010df9c1" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.001s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 575.420677] env[68906]: INFO dbcounter [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] Registered counter for database nova_cell0
[ 575.429010] env[68906]: INFO dbcounter [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] Registered counter for database nova_cell1
[ 575.432007] env[68906]: DEBUG oslo_db.sqlalchemy.engines [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_AUTO_CREATE_USER,NO_ENGINE_SUBSTITUTION {{(pid=68906) _check_effective_sql_mode /opt/stack/data/venv/lib/python3.10/site-packages/oslo_db/sqlalchemy/engines.py:342}}
[ 575.432596] env[68906]: DEBUG oslo_db.sqlalchemy.engines [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_AUTO_CREATE_USER,NO_ENGINE_SUBSTITUTION {{(pid=68906) _check_effective_sql_mode /opt/stack/data/venv/lib/python3.10/site-packages/oslo_db/sqlalchemy/engines.py:342}}
[ 575.436733] env[68906]: DEBUG dbcounter [-] [68906] Writer thread running {{(pid=68906) stat_writer /opt/stack/data/venv/lib/python3.10/site-packages/dbcounter.py:102}}
[ 575.437582] env[68906]: DEBUG dbcounter [-] [68906] Writer thread running {{(pid=68906) stat_writer /opt/stack/data/venv/lib/python3.10/site-packages/dbcounter.py:102}}
[ 575.439641] env[68906]: ERROR nova.db.main.api [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] No DB access allowed in nova-compute: File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main
[ 575.439641] env[68906]:     result = function(*args, **kwargs)
[ 575.439641] env[68906]:   File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper
[ 575.439641] env[68906]:     return func(*args, **kwargs)
[ 575.439641] env[68906]:   File "/opt/stack/nova/nova/context.py", line 422, in gather_result
[ 575.439641] env[68906]:     result = fn(*args, **kwargs)
[ 575.439641] env[68906]:   File "/opt/stack/nova/nova/db/main/api.py", line 179, in wrapper
[ 575.439641] env[68906]:     return f(*args, **kwargs)
[ 575.439641] env[68906]:   File "/opt/stack/nova/nova/objects/service.py", line 548, in _db_service_get_minimum_version
[ 575.439641] env[68906]:     return db.service_get_minimum_version(context, binaries)
[ 575.439641] env[68906]:   File "/opt/stack/nova/nova/db/main/api.py", line 238, in wrapper
[ 575.439641] env[68906]:     _check_db_access()
[ 575.439641] env[68906]:   File "/opt/stack/nova/nova/db/main/api.py", line 188, in _check_db_access
[ 575.439641] env[68906]:     stacktrace = ''.join(traceback.format_stack())
[ 575.439641] env[68906]: 
[ 575.440747] env[68906]: ERROR nova.db.main.api [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] No DB access allowed in nova-compute: File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main
[ 575.440747] env[68906]:     result = function(*args, **kwargs)
[ 575.440747] env[68906]:   File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper
[ 575.440747] env[68906]:     return func(*args, **kwargs)
[ 575.440747] env[68906]:   File "/opt/stack/nova/nova/context.py", line 422, in gather_result
[ 575.440747] env[68906]:     result = fn(*args, **kwargs)
[ 575.440747] env[68906]:   File "/opt/stack/nova/nova/db/main/api.py", line 179, in wrapper
[ 575.440747] env[68906]:     return f(*args, **kwargs)
[ 575.440747] env[68906]:   File "/opt/stack/nova/nova/objects/service.py", line 548, in _db_service_get_minimum_version
[ 575.440747] env[68906]:     return db.service_get_minimum_version(context, binaries)
[ 575.440747] env[68906]:   File "/opt/stack/nova/nova/db/main/api.py", line 238, in wrapper
[ 575.440747] env[68906]:     _check_db_access()
[ 575.440747] env[68906]:   File "/opt/stack/nova/nova/db/main/api.py", line 188, in _check_db_access
[ 575.440747] env[68906]:     stacktrace = ''.join(traceback.format_stack())
[ 575.440747] env[68906]: 
[ 575.441150] env[68906]: WARNING nova.objects.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] Failed to get minimum service version for cell 00000000-0000-0000-0000-000000000000
[ 575.441264] env[68906]: WARNING nova.objects.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] Failed to get minimum service version for cell b8262c3e-187a-4d14-9c42-8f02010df9c1
[ 575.441705] env[68906]: DEBUG oslo_concurrency.lockutils [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] Acquiring lock "singleton_lock" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 575.441864] env[68906]: DEBUG oslo_concurrency.lockutils [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] Acquired lock "singleton_lock" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 575.442117] env[68906]: DEBUG oslo_concurrency.lockutils [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] Releasing lock "singleton_lock" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 575.442445] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] Full set of CONF: {{(pid=68906) _wait_for_exit_or_signal /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/service.py:362}}
[ 575.442584] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] ******************************************************************************** {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2589}}
[ 575.442710] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] Configuration options gathered from: {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2590}}
[ 575.442844] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] command line args: ['--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-cpu-common.conf', '--config-file', '/etc/nova/nova-cpu-1.conf'] {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2591}}
[ 575.443050] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-cpu-common.conf', '/etc/nova/nova-cpu-1.conf'] {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2592}}
[ 575.443185] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] ================================================================================ {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2594}}
[ 575.443395] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] allow_resize_to_same_host = True {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 575.443565] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] arq_binding_timeout = 300 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 575.443696] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] backdoor_port = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 575.443821] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] backdoor_socket = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 575.443981] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] block_device_allocate_retries = 60 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 575.444171] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] block_device_allocate_retries_interval = 3 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 575.444388] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] cert = self.pem {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 575.444565] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] compute_driver = vmwareapi.VMwareVCDriver {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 575.444748] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] compute_monitors = [] {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 575.444904] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] config_dir = [] {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 575.445093] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] config_drive_format = iso9660 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 575.445229] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] config_file = ['/etc/nova/nova.conf', '/etc/nova/nova-cpu-common.conf', '/etc/nova/nova-cpu-1.conf'] {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 575.445394] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] config_source = [] {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 575.445558] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] console_host = devstack {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 575.445719] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] control_exchange = nova {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 575.445876] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] cpu_allocation_ratio = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 575.446042] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] daemon = False {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 575.446217] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] debug = True {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 575.446371] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] default_access_ip_network_name = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 575.446537] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] default_availability_zone = nova {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 575.446695] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] default_ephemeral_format = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 575.446858] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] default_green_pool_size = 1000 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 575.447112] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] default_log_levels = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 575.447333] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] default_schedule_zone = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 575.447516] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] disk_allocation_ratio = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 575.447681] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] enable_new_services = True {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 575.447866] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] enabled_apis = ['osapi_compute'] {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 575.448049] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] enabled_ssl_apis = [] {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 575.448261] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] flat_injected = False {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 575.448420] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] force_config_drive = False {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 575.448581] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] force_raw_images = True {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 575.448750] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] graceful_shutdown_timeout = 5 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 575.448908] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] heal_instance_info_cache_interval = 60 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 575.449134] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] host = cpu-1 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 575.449334] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] initial_cpu_allocation_ratio = 4.0 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 575.449501] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] initial_disk_allocation_ratio = 1.0 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 575.449664] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] initial_ram_allocation_ratio = 1.0 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 575.449877] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] injected_network_template = /opt/stack/nova/nova/virt/interfaces.template {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 575.450050] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] instance_build_timeout = 0 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 575.450215] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] instance_delete_interval = 300 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 575.450383] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] instance_format = [instance: %(uuid)s] {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 575.450555] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] instance_name_template = instance-%08x {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 575.450707] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] instance_usage_audit = False {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 575.450874] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] instance_usage_audit_period = month {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 575.451050] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] instance_uuid_format = [instance: %(uuid)s] {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 575.451218] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] instances_path = /opt/stack/data/nova/instances {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 575.451386] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] internal_service_availability_zone = internal {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 575.451543] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] key = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 575.451701] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] live_migration_retry_count = 30 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 575.451862] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] log_config_append = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 575.452036] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] log_date_format = %Y-%m-%d %H:%M:%S {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 575.452218] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] log_dir = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 575.452398] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] log_file = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 575.452530] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] log_options = True {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 575.452692] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] log_rotate_interval = 1 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 575.452860] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] log_rotate_interval_type = days {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 575.453034] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] log_rotation_type = none {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 575.453168] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] logging_context_format_string = %(color)s%(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(project_name)s %(user_name)s%(color)s] %(instance)s%(color)s%(message)s {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 575.453295] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] logging_debug_format_suffix = {{(pid=%(process)d) %(funcName)s %(pathname)s:%(lineno)d}} {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 575.453474] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] logging_default_format_string = %(color)s%(levelname)s %(name)s [-%(color)s] %(instance)s%(color)s%(message)s {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 575.453630] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] logging_exception_prefix = ERROR %(name)s %(instance)s {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 575.453758] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] logging_user_identity_format = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 575.453917] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] long_rpc_timeout = 1800 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 575.454086] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] max_concurrent_builds = 10 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 575.454245] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] max_concurrent_live_migrations = 1 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 575.454405] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] max_concurrent_snapshots = 5 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 575.454559] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] max_local_block_devices = 3 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 575.454714] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] max_logfile_count = 30 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 575.454872] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] max_logfile_size_mb = 200 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 575.455043] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] maximum_instance_delete_attempts = 5 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 575.455214] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] metadata_listen = 0.0.0.0 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 575.455383] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] metadata_listen_port = 8775 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 575.455547] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] metadata_workers = 2 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 575.455709] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] migrate_max_retries = -1 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 575.455868] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] mkisofs_cmd = genisoimage {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 575.456083] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] my_block_storage_ip = 10.180.1.21 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 575.456242] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] my_ip = 10.180.1.21 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 575.456438] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] network_allocate_retries = 0 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 575.456584] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 575.456747] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] osapi_compute_listen = 0.0.0.0 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 575.456908] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] osapi_compute_listen_port = 8774 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 575.457086] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] osapi_compute_unique_server_name_scope = {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 575.457254] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] osapi_compute_workers = 2 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 575.457416] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] password_length = 12 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 575.457578] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] periodic_enable = True {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 575.457735] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] periodic_fuzzy_delay = 60 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 575.457900] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] pointer_model = usbtablet {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 575.458076] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] preallocate_images = none {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 575.458266] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] publish_errors = False {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 575.458400] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] pybasedir = /opt/stack/nova {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 575.458559] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] ram_allocation_ratio = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 575.458718] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] rate_limit_burst = 0 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 575.458883] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] rate_limit_except_level = CRITICAL {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 575.459054] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] rate_limit_interval = 0 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 575.459237] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] reboot_timeout = 0 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 575.459409] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] reclaim_instance_interval = 0 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 575.459565] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] record = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 575.459733] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] reimage_timeout_per_gb = 60 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 575.459897] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] report_interval = 120 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 575.460066] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] rescue_timeout = 0 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 575.460227] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] reserved_host_cpus = 0 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 575.460387] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] reserved_host_disk_mb = 0 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 575.460543] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] reserved_host_memory_mb = 512 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 575.460699] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] reserved_huge_pages = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 575.460855] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] resize_confirm_window = 0 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 575.461024] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] resize_fs_using_block_device = False {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 575.461188] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] resume_guests_state_on_host_boot = False {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 575.461359] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] rootwrap_config = /etc/nova/rootwrap.conf {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 575.461519] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] rpc_response_timeout = 60 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 575.461677] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] run_external_periodic_tasks = True {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 575.461841] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] running_deleted_instance_action = reap {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 575.461999] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] running_deleted_instance_poll_interval = 1800 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 575.462170] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] running_deleted_instance_timeout = 0 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 575.462327] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] scheduler_instance_sync_interval = 120 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 575.462494] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] service_down_time = 720 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 575.462657] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] servicegroup_driver = db {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 575.462815] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] shelved_offload_time = 0 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 575.462969] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] shelved_poll_interval = 3600 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 575.463145] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] shutdown_timeout = 0 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 575.463306] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] source_is_ipv6 = False {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 575.463464] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] ssl_only = False {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 575.463702] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] state_path = /opt/stack/data/n-cpu-1 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 575.463865] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] sync_power_state_interval = 600 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 575.464031] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] sync_power_state_pool_size = 1000 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 575.464198] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] syslog_log_facility = LOG_USER {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 575.464385] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] tempdir = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 575.464550] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] timeout_nbd = 10 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 575.464715] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] transport_url = **** {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 575.464873] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] update_resources_interval = 0 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 575.465042] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] use_cow_images = True {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 575.465204] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] use_eventlog = False {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 575.465363] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] use_journal = False {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 575.465520] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] use_json = False {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 575.465675] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] use_rootwrap_daemon = False {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 575.465829] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] use_stderr = False {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 575.465987] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] use_syslog = False {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 575.466160] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] vcpu_pin_set = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 575.466321] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] vif_plugging_is_fatal = True {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 575.466488] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] vif_plugging_timeout = 300 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 575.466650] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] virt_mkfs = [] {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 575.466808] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] volume_usage_poll_interval = 0 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 575.466965] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] watch_log_file = False {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 575.467143] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] web = /usr/share/spice-html5 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 575.467330] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] oslo_concurrency.disable_process_locking = False {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.467622] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] oslo_concurrency.lock_path = /opt/stack/data/n-cpu-1 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.467804] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] oslo_messaging_metrics.metrics_buffer_size = 1000 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.467972] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] oslo_messaging_metrics.metrics_enabled = False {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.468156] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] oslo_messaging_metrics.metrics_process_name = {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.468355] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.468528] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.468710] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] api.auth_strategy = keystone {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.468903] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] api.compute_link_prefix = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.469065] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.469259] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] api.dhcp_domain = novalocal {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.469440] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] api.enable_instance_password = True {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.469605] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] api.glance_link_prefix = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.469768] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] api.instance_list_cells_batch_fixed_size = 100 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.469940] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] api.instance_list_cells_batch_strategy = distributed {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.470116] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] api.instance_list_per_project_cells = False {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.470278] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] api.list_records_by_skipping_down_cells = True {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.470440] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] api.local_metadata_per_cell = False {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.470607] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] api.max_limit = 1000 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.470771] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] api.metadata_cache_expiration = 15 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.470953] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] api.neutron_default_tenant_id = default {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.471135] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] api.use_forwarded_for = False {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.471301] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] api.use_neutron_default_nets = False {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.471472] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] api.vendordata_dynamic_connect_timeout = 5 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.471637] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] api.vendordata_dynamic_failure_fatal = False {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.471801] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] api.vendordata_dynamic_read_timeout = 5 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.471973] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] api.vendordata_dynamic_ssl_certfile = {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.472161] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] api.vendordata_dynamic_targets = [] {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.472325] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] api.vendordata_jsonfile_path = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.472502] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] api.vendordata_providers = ['StaticJSON'] {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.472691] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] cache.backend = dogpile.cache.memcached {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.472857] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] cache.backend_argument = **** {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.473036] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] cache.config_prefix = cache.oslo {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.473212] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] cache.dead_timeout = 60.0 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.473380] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] cache.debug_cache_backend = False {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.473541] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] cache.enable_retry_client = False {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.473701] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] cache.enable_socket_keepalive = False {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.473869] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] cache.enabled = True {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.474043] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] cache.expiration_time = 600 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.474208] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] cache.hashclient_retry_attempts = 2 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.474373] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] cache.hashclient_retry_delay = 1.0 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.474532] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] cache.memcache_dead_retry = 300 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.474700] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] cache.memcache_password = {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.474863] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] cache.memcache_pool_connection_get_timeout = 10 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.475033] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] cache.memcache_pool_flush_on_reconnect = False {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.475199] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] cache.memcache_pool_maxsize = 10 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.475364] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] cache.memcache_pool_unused_timeout = 60 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.475523] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] cache.memcache_sasl_enabled = False {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.475700] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] cache.memcache_servers = ['localhost:11211'] {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.475863] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] cache.memcache_socket_timeout = 1.0 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.476034] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] cache.memcache_username = {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.476235] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] cache.proxies = [] {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.476411] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] cache.retry_attempts = 2 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.476578] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] cache.retry_delay = 0.0 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.476740] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] cache.socket_keepalive_count = 1 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.476899] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] cache.socket_keepalive_idle = 1 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.477070] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] cache.socket_keepalive_interval = 1 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.477230] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] cache.tls_allowed_ciphers = None {{(pid=68906) log_opt_values
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.477386] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] cache.tls_cafile = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.477539] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] cache.tls_certfile = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.477697] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] cache.tls_enabled = False {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.477850] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] cache.tls_keyfile = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.478025] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] cinder.auth_section = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.478219] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] cinder.auth_type = password {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.478396] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] cinder.cafile = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.478574] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] cinder.catalog_info = volumev3::publicURL {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.478733] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] cinder.certfile = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.478894] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] cinder.collect_timing = False {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.479063] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] cinder.cross_az_attach = True {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.479246] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] cinder.debug = False {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.479418] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] cinder.endpoint_template = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.479583] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] cinder.http_retries = 3 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.479742] env[68906]: DEBUG oslo_service.service [None 
[ 575.479900] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] cinder.keyfile = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.480081] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] cinder.os_region_name = RegionOne {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.480247] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] cinder.split_loggers = False {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.480411] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] cinder.timeout = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.480582] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] compute.consecutive_build_service_disable_threshold = 10 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.480742] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] compute.cpu_dedicated_set = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.480900] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] compute.cpu_shared_set = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.481074] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] compute.image_type_exclude_list = [] {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.481240] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] compute.live_migration_wait_for_vif_plug = True {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.481404] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] compute.max_concurrent_disk_ops = 0 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.481565] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] compute.max_disk_devices_to_attach = -1 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.481724] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] compute.packing_host_numa_cells_allocation_strategy = False {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.481891] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] compute.provider_config_location = /etc/nova/provider_config/ {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.482065] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] compute.resource_provider_association_refresh = 300 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.482232] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] compute.shutdown_retry_interval = 10 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.482411] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] compute.vmdk_allowed_types = ['streamOptimized', 'monolithicSparse'] {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.482586] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] conductor.workers = 2 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.482758] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] console.allowed_origins = [] {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.482917] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] console.ssl_ciphers = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.483094] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] console.ssl_minimum_version = default {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.483267] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] consoleauth.token_ttl = 600 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.483441] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] cyborg.cafile = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.483600] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] cyborg.certfile = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.483761] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] cyborg.collect_timing = False {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.483918] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] cyborg.connect_retries = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.484084] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] cyborg.connect_retry_delay = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.484241] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] cyborg.endpoint_override = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.484403] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] cyborg.insecure = False {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.484563] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] cyborg.keyfile = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.484722] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] cyborg.max_version = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.484878] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] cyborg.min_version = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.485043] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] cyborg.region_name = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.485201] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] cyborg.service_name = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.485371] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] cyborg.service_type = accelerator {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.485530] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] cyborg.split_loggers = False {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.485684] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] cyborg.status_code_retries = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.485838] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] cyborg.status_code_retry_delay = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.485992] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] cyborg.timeout = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.486206] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] cyborg.valid_interfaces = ['internal', 'public'] {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.486380] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] cyborg.version = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.486566] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] database.backend = sqlalchemy {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.486746] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] database.connection = **** {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.486914] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] database.connection_debug = 0 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.487097] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] database.connection_parameters = {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.487265] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] database.connection_recycle_time = 3600 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.487434] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] database.connection_trace = False {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.487621] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] database.db_inc_retry_interval = True {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.487751] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] database.db_max_retries = 20 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.487913] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] database.db_max_retry_interval = 10 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.488086] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] database.db_retry_interval = 1 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.488306] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] database.max_overflow = 50 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.488484] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] database.max_pool_size = 5 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.488657] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] database.max_retries = 10 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.488827] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] database.mysql_sql_mode = TRADITIONAL {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.488984] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] database.mysql_wsrep_sync_wait = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.489775] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] database.pool_timeout = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.489775] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] database.retry_interval = 10 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.489775] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] database.slave_connection = **** {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.489775] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] database.sqlite_synchronous = True {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.489913] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] database.use_db_reconnect = False {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.489978] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] api_database.backend = sqlalchemy {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.490171] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] api_database.connection = **** {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.490343] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] api_database.connection_debug = 0 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.490517] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] api_database.connection_parameters = {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.490678] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] api_database.connection_recycle_time = 3600 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.490843] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] api_database.connection_trace = False {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.491011] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] api_database.db_inc_retry_interval = True {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.491183] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] api_database.db_max_retries = 20 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.491345] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] api_database.db_max_retry_interval = 10 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.491508] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] api_database.db_retry_interval = 1 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.491676] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] api_database.max_overflow = 50 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.491837] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] api_database.max_pool_size = 5 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.492006] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] api_database.max_retries = 10 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.492180] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] api_database.mysql_sql_mode = TRADITIONAL {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.492342] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] api_database.mysql_wsrep_sync_wait = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.492504] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] api_database.pool_timeout = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.492671] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] api_database.retry_interval = 10 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.492829] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] api_database.slave_connection = **** {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.492993] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] api_database.sqlite_synchronous = True {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.493180] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] devices.enabled_mdev_types = [] {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.493355] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] ephemeral_storage_encryption.cipher = aes-xts-plain64 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.493519] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] ephemeral_storage_encryption.enabled = False {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.493686] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] ephemeral_storage_encryption.key_size = 512 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.493853] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] glance.api_servers = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.494026] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] glance.cafile = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.494192] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] glance.certfile = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.494357] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] glance.collect_timing = False {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.494515] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] glance.connect_retries = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.494672] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] glance.connect_retry_delay = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.494832] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] glance.debug = False {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.494997] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] glance.default_trusted_certificate_ids = [] {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.495171] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] glance.enable_certificate_validation = False {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.495331] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] glance.enable_rbd_download = False {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.495492] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] glance.endpoint_override = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.495655] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] glance.insecure = False {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.495816] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] glance.keyfile = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.495972] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] glance.max_version = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.496164] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] glance.min_version = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.496336] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] glance.num_retries = 3 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.496506] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] glance.rbd_ceph_conf = {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.496845] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] glance.rbd_connect_timeout = 5 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.496845] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] glance.rbd_pool = {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.496988] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] glance.rbd_user = {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.497160] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] glance.region_name = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.497320] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] glance.service_name = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.497488] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] glance.service_type = image {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.497648] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] glance.split_loggers = False {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.497806] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] glance.status_code_retries = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.497962] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] glance.status_code_retry_delay = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.498134] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] glance.timeout = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.498342] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] glance.valid_interfaces = ['internal', 'public'] {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.498516] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] glance.verify_glance_signatures = False {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.498678] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] glance.version = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.498842] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] guestfs.debug = False {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.499013] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] hyperv.config_drive_cdrom = False {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.499192] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] hyperv.config_drive_inject_password = False {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.499376] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] hyperv.dynamic_memory_ratio = 1.0 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.499539] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] hyperv.enable_instance_metrics_collection = False {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.499700] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] hyperv.enable_remotefx = False {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.499870] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] hyperv.instances_path_share = {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.500043] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] hyperv.iscsi_initiator_list = [] {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.500229] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] hyperv.limit_cpu_features = False {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.500409] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] hyperv.mounted_disk_query_retry_count = 10 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.500571] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] hyperv.mounted_disk_query_retry_interval = 5 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.500729] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] hyperv.power_state_check_timeframe = 60 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.500894] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] hyperv.power_state_event_polling_interval = 2 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.501074] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] hyperv.qemu_img_cmd = qemu-img.exe {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.501238] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] hyperv.use_multipath_io = False {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.501398] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] hyperv.volume_attach_retry_count = 10 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.501555] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] hyperv.volume_attach_retry_interval = 5 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.501709] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] hyperv.vswitch_name = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.501866] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] hyperv.wait_soft_reboot_seconds = 60 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.502039] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] mks.enabled = False {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.502396] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] mks.mksproxy_base_url = http://127.0.0.1:6090/ {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.502587] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] image_cache.manager_interval = 2400 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.502753] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] image_cache.precache_concurrency = 1 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.502921] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] image_cache.remove_unused_base_images = True {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.503099] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] image_cache.remove_unused_original_minimum_age_seconds = 86400 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.503269] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] image_cache.remove_unused_resized_minimum_age_seconds = 3600 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.503442] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] image_cache.subdirectory_name = _base {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.503613] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] ironic.api_max_retries = 60 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
= 60 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.503773] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] ironic.api_retry_interval = 2 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.503928] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] ironic.auth_section = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.504098] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] ironic.auth_type = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.504259] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] ironic.cafile = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.504467] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] ironic.certfile = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.504654] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] ironic.collect_timing = False {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.504820] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] ironic.conductor_group = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.504979] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] ironic.connect_retries = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.505155] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] ironic.connect_retry_delay = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.505314] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] ironic.endpoint_override = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.505477] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] ironic.insecure = False {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.505636] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] ironic.keyfile = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.505793] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] ironic.max_version = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.505950] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] ironic.min_version = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.506135] env[68906]: DEBUG 
oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] ironic.peer_list = [] {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.506298] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] ironic.region_name = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.506457] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] ironic.serial_console_state_timeout = 10 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.506615] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] ironic.service_name = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.506782] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] ironic.service_type = baremetal {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.506941] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] ironic.split_loggers = False {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.507111] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] ironic.status_code_retries = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.507267] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] ironic.status_code_retry_delay = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.507426] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] ironic.timeout = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.507605] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] ironic.valid_interfaces = ['internal', 'public'] {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.507766] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] ironic.version = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.507945] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] key_manager.backend = nova.keymgr.conf_key_mgr.ConfKeyManager {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.508131] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] key_manager.fixed_key = **** {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.508340] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] barbican.auth_endpoint = http://localhost/identity/v3 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.508509] env[68906]: DEBUG oslo_service.service [None 
req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] barbican.barbican_api_version = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.508667] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] barbican.barbican_endpoint = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.508836] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] barbican.barbican_endpoint_type = public {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.508992] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] barbican.barbican_region_name = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.509165] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] barbican.cafile = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.509358] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] barbican.certfile = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.509528] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] barbican.collect_timing = False {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.509687] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] barbican.insecure = False {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.509842] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] barbican.keyfile = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.510008] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] barbican.number_of_retries = 60 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.510175] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] barbican.retry_delay = 1 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.510337] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] barbican.send_service_user_token = False {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.510502] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] barbican.split_loggers = False {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.510655] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] barbican.timeout = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.510815] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] barbican.verify_ssl = True {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.510970] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] barbican.verify_ssl_path = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.511147] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] barbican_service_user.auth_section = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.511311] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] barbican_service_user.auth_type = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.511467] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] barbican_service_user.cafile = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.511622] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] barbican_service_user.certfile = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.511782] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] barbican_service_user.collect_timing = False {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.511939] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] barbican_service_user.insecure = False {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.512107] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] barbican_service_user.keyfile = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.512292] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] barbican_service_user.split_loggers = False {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.512464] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] barbican_service_user.timeout = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.512632] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] vault.approle_role_id = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.512789] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] vault.approle_secret_id = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.512947] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] vault.cafile = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.513121] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] vault.certfile = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.513285] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] vault.collect_timing = False {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.513496] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] vault.insecure = False {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.513681] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] vault.keyfile = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.513854] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] vault.kv_mountpoint = secret {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.514194] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] vault.kv_path = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.514194] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] vault.kv_version = 2 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.514340] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] vault.namespace = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.514501] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] vault.root_token_id = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.514655] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] vault.split_loggers = False {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.514812] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] vault.ssl_ca_crt_file = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.514970] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] vault.timeout = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.515145] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] vault.use_ssl = False {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.515316] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] vault.vault_url = http://127.0.0.1:8200 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
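Every record in this dump comes from the same place: oslo.config's log_opt_values() (the cfg.py:2609 frame cited in each line), which oslo.service-based processes call at startup to log the effective value of every registered option, group by group, at DEBUG level. A minimal sketch of that mechanism, using illustrative option names rather than Nova's real registration code:

    import logging

    from oslo_config import cfg

    CONF = cfg.CONF
    LOG = logging.getLogger(__name__)

    # Illustrative options only; Nova registers its real ones elsewhere.
    CONF.register_opts(
        [
            cfg.StrOpt('barbican_endpoint_type', default='public'),
            cfg.IntOpt('number_of_retries', default=60),
        ],
        group='barbican',
    )

    logging.basicConfig(level=logging.DEBUG)
    CONF([])  # parse an (empty) command line and any config files
    # Logs each registered option as 'group.option = value', as above.
    CONF.log_opt_values(LOG, logging.DEBUG)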
[ 575.515483] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] keystone.auth_section = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.515644] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] keystone.auth_type = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.515802] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] keystone.cafile = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.515960] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] keystone.certfile = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.516154] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] keystone.collect_timing = False {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.516326] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] keystone.connect_retries = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.516484] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] keystone.connect_retry_delay = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.516637] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] keystone.endpoint_override = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.516794] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] keystone.insecure = False {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.516947] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] keystone.keyfile = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.517113] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] keystone.max_version = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.517269] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] keystone.min_version = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.517427] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] keystone.region_name = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.517580] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] keystone.service_name = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.517747] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] keystone.service_type = identity {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.517906] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] keystone.split_loggers = False {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.518069] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] keystone.status_code_retries = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.518256] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] keystone.status_code_retry_delay = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.518425] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] keystone.timeout = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.518605] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] keystone.valid_interfaces = ['internal', 'public'] {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.518765] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] keystone.version = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.518964] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] libvirt.connection_uri = {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.519137] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] libvirt.cpu_mode = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.519327] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] libvirt.cpu_model_extra_flags = [] {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.519500] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] libvirt.cpu_models = [] {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.519669] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] libvirt.cpu_power_governor_high = performance {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.519834] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] libvirt.cpu_power_governor_low = powersave {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.519994] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] libvirt.cpu_power_management = False {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.520179] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] libvirt.cpu_power_management_strategy = cpu_state {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.520342] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] libvirt.device_detach_attempts = 8 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.520506] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] libvirt.device_detach_timeout = 20 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.520667] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] libvirt.disk_cachemodes = [] {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.520827] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] libvirt.disk_prefix = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.520992] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] libvirt.enabled_perf_events = [] {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.521171] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] libvirt.file_backed_memory = 0 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.521337] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] libvirt.gid_maps = [] {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.521496] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] libvirt.hw_disk_discard = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.521653] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] libvirt.hw_machine_type = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.521819] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] libvirt.images_rbd_ceph_conf = {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.521986] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] libvirt.images_rbd_glance_copy_poll_interval = 15 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.522167] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] libvirt.images_rbd_glance_copy_timeout = 600 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.522338] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] libvirt.images_rbd_glance_store_name = {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.522509] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] libvirt.images_rbd_pool = rbd {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.522678] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] libvirt.images_type = default {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.522836] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] libvirt.images_volume_group = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.522994] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] libvirt.inject_key = False {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.523166] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] libvirt.inject_partition = -2 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.523324] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] libvirt.inject_password = False {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.523482] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] libvirt.iscsi_iface = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.523638] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] libvirt.iser_use_multipath = False {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.523798] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] libvirt.live_migration_bandwidth = 0 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.523959] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] libvirt.live_migration_completion_timeout = 800 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.524134] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] libvirt.live_migration_downtime = 500 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.524323] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] libvirt.live_migration_downtime_delay = 75 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.524503] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] libvirt.live_migration_downtime_steps = 10 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.524668] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] libvirt.live_migration_inbound_addr = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.524833] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] libvirt.live_migration_permit_auto_converge = False {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.524992] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] libvirt.live_migration_permit_post_copy = False {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.525165] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] libvirt.live_migration_scheme = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.525335] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] libvirt.live_migration_timeout_action = abort {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.525495] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] libvirt.live_migration_tunnelled = False {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.525651] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] libvirt.live_migration_uri = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.525810] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] libvirt.live_migration_with_native_tls = False {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.525965] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] libvirt.max_queues = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.526147] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] libvirt.mem_stats_period_seconds = 10 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.526308] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] libvirt.nfs_mount_options = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.526635] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] libvirt.nfs_mount_point_base = /opt/stack/data/n-cpu-1/mnt {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.526807] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] libvirt.num_aoe_discover_tries = 3 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.526970] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] libvirt.num_iser_scan_tries = 5 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.527152] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] libvirt.num_memory_encrypted_guests = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.527321] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] libvirt.num_nvme_discover_tries = 5 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.527490] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] libvirt.num_pcie_ports = 0 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.527655] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] libvirt.num_volume_scan_tries = 5 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.527821] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] libvirt.pmem_namespaces = [] {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.527981] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] libvirt.quobyte_client_cfg = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.528310] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] libvirt.quobyte_mount_point_base = /opt/stack/data/n-cpu-1/mnt {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.528494] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] libvirt.rbd_connect_timeout = 5 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.528662] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] libvirt.rbd_destroy_volume_retries = 12 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.528828] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] libvirt.rbd_destroy_volume_retry_interval = 5 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.528992] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] libvirt.rbd_secret_uuid = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.529167] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] libvirt.rbd_user = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.529359] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] libvirt.realtime_scheduler_priority = 1 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.529537] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] libvirt.remote_filesystem_transport = ssh {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.529699] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] libvirt.rescue_image_id = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.529857] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] libvirt.rescue_kernel_id = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.530019] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] libvirt.rescue_ramdisk_id = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.530191] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] libvirt.rng_dev_path = /dev/urandom {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.530352] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] libvirt.rx_queue_size = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.530516] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] libvirt.smbfs_mount_options = {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.530795] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] libvirt.smbfs_mount_point_base = /opt/stack/data/n-cpu-1/mnt {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.530968] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] libvirt.snapshot_compression = False {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.531143] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] libvirt.snapshot_image_format = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.531363] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] libvirt.snapshots_directory = /opt/stack/data/nova/instances/snapshots {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.531532] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] libvirt.sparse_logical_volumes = False {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.531694] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] libvirt.swtpm_enabled = False {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.531863] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] libvirt.swtpm_group = tss {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.532040] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] libvirt.swtpm_user = tss {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.532212] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] libvirt.sysinfo_serial = unique {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.532371] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] libvirt.tb_cache_size = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.532528] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] libvirt.tx_queue_size = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.532690] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] libvirt.uid_maps = [] {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.532853] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] libvirt.use_virtio_for_bridges = True {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.533032] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] libvirt.virt_type = kvm {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.533207] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] libvirt.volume_clear = zero {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.533375] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] libvirt.volume_clear_size = 0 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.533545] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] libvirt.volume_use_multipath = False {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.533698] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] libvirt.vzstorage_cache_path = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.533866] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] libvirt.vzstorage_log_path = /var/log/vstorage/%(cluster_name)s/nova.log.gz {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.534042] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] libvirt.vzstorage_mount_group = qemu {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.534211] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] libvirt.vzstorage_mount_opts = [] {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.534381] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] libvirt.vzstorage_mount_perms = 0770 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.534651] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] libvirt.vzstorage_mount_point_base = /opt/stack/data/n-cpu-1/mnt {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.534827] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] libvirt.vzstorage_mount_user = stack {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.534995] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] libvirt.wait_soft_reboot_seconds = 120 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
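Unset options render differently in this dump depending on how they were declared: libvirt.connection_uri prints an empty value (a string option whose default is the empty string), libvirt.cpu_mode prints None (no default at all), and list-valued options such as libvirt.cpu_models print []. A small sketch of the three cases, with hypothetical re-declarations of those names:

    import logging

    from oslo_config import cfg

    CONF = cfg.ConfigOpts()
    LOG = logging.getLogger(__name__)

    # Hypothetical options mirroring the three renderings seen above.
    CONF.register_opts(
        [
            cfg.StrOpt('connection_uri', default=''),  # logs 'connection_uri = '
            cfg.StrOpt('cpu_mode'),                    # no default -> '= None'
            cfg.ListOpt('cpu_models', default=[]),     # logs 'cpu_models = []'
        ],
        group='libvirt',
    )

    logging.basicConfig(level=logging.DEBUG)
    CONF([])
    CONF.log_opt_values(LOG, logging.DEBUG)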
[ 575.535184] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] neutron.auth_section = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.535362] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] neutron.auth_type = password {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.535526] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] neutron.cafile = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.535688] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] neutron.certfile = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.535851] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] neutron.collect_timing = False {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.536025] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] neutron.connect_retries = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.536220] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] neutron.connect_retry_delay = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.536403] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] neutron.default_floating_pool = public {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.536564] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] neutron.endpoint_override = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.536732] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] neutron.extension_sync_interval = 600 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.536892] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] neutron.http_retries = 3 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.537064] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] neutron.insecure = False {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.537231] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] neutron.keyfile = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.537393] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] neutron.max_version = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.537565] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] neutron.metadata_proxy_shared_secret = **** {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.537727] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] neutron.min_version = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.537896] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] neutron.ovs_bridge = br-int {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.538072] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] neutron.physnets = [] {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.538269] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] neutron.region_name = RegionOne {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.538452] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] neutron.service_metadata_proxy = True {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.538616] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] neutron.service_name = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.538784] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] neutron.service_type = network {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.538945] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] neutron.split_loggers = False {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.539115] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] neutron.status_code_retries = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.539302] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] neutron.status_code_retry_delay = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.539468] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] neutron.timeout = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.539648] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] neutron.valid_interfaces = ['internal', 'public'] {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.539810] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] neutron.version = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
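Note that neutron.metadata_proxy_shared_secret is logged as ****: oslo.config masks any option registered with secret=True when dumping values, so credentials never land in the log (placement.password below gets the same treatment). A minimal sketch with a hypothetical secret value:

    import logging

    from oslo_config import cfg

    CONF = cfg.ConfigOpts()
    LOG = logging.getLogger(__name__)

    # secret=True is what turns the logged value into '****'.
    CONF.register_opts(
        [cfg.StrOpt('metadata_proxy_shared_secret', secret=True,
                    default='not-a-real-secret')],
        group='neutron',
    )

    logging.basicConfig(level=logging.DEBUG)
    CONF([])
    CONF.log_opt_values(LOG, logging.DEBUG)  # logs '****', never the value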
[ 575.539982] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] notifications.bdms_in_notifications = False {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.540176] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] notifications.default_level = INFO {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.540356] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] notifications.notification_format = unversioned {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.540525] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] notifications.notify_on_state_change = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.540698] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] notifications.versioned_notifications_topics = ['versioned_notifications'] {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.540875] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] pci.alias = [] {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.541056] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] pci.device_spec = [] {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.541225] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] pci.report_in_placement = False {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.541397] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] placement.auth_section = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.541571] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] placement.auth_type = password {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.541739] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] placement.auth_url = http://10.180.1.21/identity {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.541901] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] placement.cafile = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.542070] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] placement.certfile = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.542237] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] placement.collect_timing = False {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.542399] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] placement.connect_retries = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.542559] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] placement.connect_retry_delay = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.542721] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] placement.default_domain_id = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.542879] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] placement.default_domain_name = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.543048] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] placement.domain_id = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.543211] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] placement.domain_name = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.543371] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] placement.endpoint_override = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.543532] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] placement.insecure = False {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.543690] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] placement.keyfile = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.543847] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] placement.max_version = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.544011] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] placement.min_version = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.544189] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] placement.password = **** {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.544349] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] placement.project_domain_id = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.544518] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] placement.project_domain_name = Default {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.544684] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] placement.project_id = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.544858] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] placement.project_name = service {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.545035] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] placement.region_name = RegionOne {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.545203] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] placement.service_name = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.545374] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] placement.service_type = placement {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.545537] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] placement.split_loggers = False {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.545695] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] placement.status_code_retries = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.545855] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] placement.status_code_retry_delay = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.546027] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] placement.system_scope = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.546216] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] placement.timeout = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.546386] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] placement.trust_id = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.546546] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] placement.user_domain_id = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.546715] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] placement.user_domain_name = Default {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.546875] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] placement.user_id = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.547059] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] placement.username = placement {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.547244] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] placement.valid_interfaces = ['internal', 'public'] {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.547407] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] placement.version = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.547583] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] quota.cores = 20 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.547749] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] quota.count_usage_from_placement = False {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.547924] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] quota.driver = nova.quota.DbQuotaDriver {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.548108] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] quota.injected_file_content_bytes = 10240 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.548353] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] quota.injected_file_path_length = 255 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.548540] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] quota.injected_files = 5 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.548712] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] quota.instances = 10 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.548879] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] quota.key_pairs = 100 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.549059] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] quota.metadata_items = 128 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.549246] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] quota.ram = 51200 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.549423] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] quota.recheck_quota = True {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.549590] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] quota.server_group_members = 10 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.549754] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] quota.server_groups = 10 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
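Each dotted prefix in these records is an oslo.config option group, which corresponds to a [section] in nova.conf: quota.cores = 20 above is what either a "[quota] cores = 20" entry or the built-in default produces. A sketch of the round trip, using an illustrative config file and a reduced option set:

    import tempfile

    from oslo_config import cfg

    conf = cfg.ConfigOpts()
    conf.register_opts(
        [cfg.IntOpt('cores', default=20), cfg.IntOpt('instances', default=10)],
        group='quota',
    )

    # Hypothetical config file overriding one of the two options.
    with tempfile.NamedTemporaryFile('w', suffix='.conf', delete=False) as f:
        f.write('[quota]\ncores = 40\n')

    conf(['--config-file', f.name])
    print(conf.quota.cores)      # 40, from the file
    print(conf.quota.instances)  # 10, the default, as in the dump above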
[ 575.549924] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] rdp.enabled = False {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.550265] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] rdp.html5_proxy_base_url = http://127.0.0.1:6083/ {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.550454] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] scheduler.discover_hosts_in_cells_interval = -1 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.550624] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] scheduler.enable_isolated_aggregate_filtering = False {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.550789] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] scheduler.image_metadata_prefilter = False {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.550952] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] scheduler.limit_tenants_to_placement_aggregate = False {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.551129] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] scheduler.max_attempts = 3 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.551297] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] scheduler.max_placement_results = 1000 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.551461] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] scheduler.placement_aggregate_required_for_tenants = False {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.551622] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] scheduler.query_placement_for_image_type_support = False {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.551783] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] scheduler.query_placement_for_routed_network_aggregates = False {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.551954] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] scheduler.workers = 2 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.552147] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] filter_scheduler.aggregate_image_properties_isolation_namespace = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.552320] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] filter_scheduler.aggregate_image_properties_isolation_separator = . {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.552501] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.552673] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] filter_scheduler.build_failure_weight_multiplier = 1000000.0 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.552838] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] filter_scheduler.cpu_weight_multiplier = 1.0 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.553006] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.553178] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] filter_scheduler.disk_weight_multiplier = 1.0 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.553366] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter', 'SameHostFilter', 'DifferentHostFilter'] {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.553536] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] filter_scheduler.host_subset_size = 1 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.553700] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] filter_scheduler.hypervisor_version_weight_multiplier = 1.0 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.553859] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] filter_scheduler.image_properties_default_architecture = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.554029] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] filter_scheduler.io_ops_weight_multiplier = -1.0 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.554200] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] filter_scheduler.isolated_hosts = [] {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.554367] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] filter_scheduler.isolated_images = [] {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.554532] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] filter_scheduler.max_instances_per_host = 50 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.554694] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] filter_scheduler.max_io_ops_per_host = 8 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.554859] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] filter_scheduler.num_instances_weight_multiplier = 0.0 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.555032] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] filter_scheduler.pci_in_placement = False {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.555201] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] filter_scheduler.pci_weight_multiplier = 1.0 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.555363] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] filter_scheduler.ram_weight_multiplier = 1.0 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.555530] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.555692] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] filter_scheduler.shuffle_best_same_weighed_hosts = False {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.555855] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] filter_scheduler.soft_affinity_weight_multiplier = 1.0 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.556028] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.556207] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] filter_scheduler.track_instance_changes = True {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 575.556388] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.556725] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] metrics.weight_multiplier = 1.0 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.556909] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] metrics.weight_of_unavailable = -10000.0 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.557061] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] metrics.weight_setting = [] {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.557364] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] serial_console.base_url = ws://127.0.0.1:6083/ {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.557540] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] serial_console.enabled = False {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.557718] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] serial_console.port_range = 10000:20000 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.557890] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] serial_console.proxyclient_address = 127.0.0.1 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.558072] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] serial_console.serialproxy_host = 0.0.0.0 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.558267] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] serial_console.serialproxy_port = 6083 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.558449] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] service_user.auth_section = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.558629] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] service_user.auth_type = password {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.558791] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] service_user.cafile = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.558950] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] service_user.certfile = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
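Every entry in this dump comes from oslo.config's ConfigOpts.log_opt_values() (the cfg.py:2609 frames on each line): at service start-up, the effective value of every registered option is logged, grouped as "group.option = value". A minimal sketch of that mechanism, using a few of the [service_user] option names and values visible in this log; the real registrations live in Nova and keystoneauth, so treat this as illustrative only:

import logging

from oslo_config import cfg

LOG = logging.getLogger(__name__)
CONF = cfg.CONF

# Option names and defaults mirror the values logged here; this is a
# sketch, not Nova's actual registration code.
service_user_opts = [
    cfg.StrOpt('auth_type', default='password'),
    cfg.BoolOpt('send_service_user_token', default=True),
    cfg.BoolOpt('collect_timing', default=False),
]
CONF.register_opts(service_user_opts, group='service_user')

def dump_effective_config():
    # Emits one DEBUG line per registered option, in the same
    # "service_user.auth_type = password" shape as the entries here.
    CONF.log_opt_values(LOG, logging.DEBUG)

Secret-bearing options (host_password, transport_url, hmac_keys) are masked as **** by the same mechanism, as seen further down.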
[ 575.559129] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] service_user.collect_timing = False {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.559328] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] service_user.insecure = False {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.559502] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] service_user.keyfile = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.559690] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] service_user.send_service_user_token = True {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.559856] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] service_user.split_loggers = False {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.560024] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] service_user.timeout = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.560198] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] spice.agent_enabled = True {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.560362] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] spice.enabled = False {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.560653] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] spice.html5proxy_base_url = http://127.0.0.1:6082/spice_auto.html {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.560844] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] spice.html5proxy_host = 0.0.0.0 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.561021] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] spice.html5proxy_port = 6082 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.561186] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] spice.image_compression = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.561346] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] spice.jpeg_compression = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.561503] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] spice.playback_compression = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.561669] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] spice.server_listen = 127.0.0.1 {{(pid=68906) log_opt_values
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.561836] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] spice.server_proxyclient_address = 127.0.0.1 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.561995] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] spice.streaming_mode = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.562168] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] spice.zlib_compression = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.562336] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] upgrade_levels.baseapi = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.562495] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] upgrade_levels.cert = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.562664] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] upgrade_levels.compute = auto {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.562821] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] upgrade_levels.conductor = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.562976] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] upgrade_levels.scheduler = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.563159] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] vendordata_dynamic_auth.auth_section = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.563327] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] vendordata_dynamic_auth.auth_type = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.563488] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] vendordata_dynamic_auth.cafile = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.563648] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] vendordata_dynamic_auth.certfile = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.563810] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] vendordata_dynamic_auth.collect_timing = False {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.563970] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] vendordata_dynamic_auth.insecure = False {{(pid=68906) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.564139] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] vendordata_dynamic_auth.keyfile = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.564301] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] vendordata_dynamic_auth.split_loggers = False {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.564460] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] vendordata_dynamic_auth.timeout = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.564634] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] vmware.api_retry_count = 10 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.564795] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] vmware.ca_file = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.564965] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] vmware.cache_prefix = devstack-image-cache {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.565145] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] vmware.cluster_name = testcl1 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.565312] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] vmware.connection_pool_size = 10 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.565490] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] vmware.console_delay_seconds = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.565711] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] vmware.datastore_regex = ^datastore.* {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.565931] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] vmware.host_ip = vc1.osci.c.eu-de-1.cloud.sap {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.566129] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] vmware.host_password = **** {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.566299] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] vmware.host_port = 443 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.566470] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] vmware.host_username = administrator@vsphere.local {{(pid=68906) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.566640] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] vmware.insecure = True {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.566803] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] vmware.integration_bridge = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.567021] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] vmware.maximum_objects = 100 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.567136] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] vmware.pbm_default_policy = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.567299] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] vmware.pbm_enabled = False {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.567459] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] vmware.pbm_wsdl_location = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.567626] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] vmware.serial_log_dir = /opt/vmware/vspc {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.567784] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] vmware.serial_port_proxy_uri = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.567942] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] vmware.serial_port_service_uri = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.568121] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] vmware.task_poll_interval = 0.5 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.568324] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] vmware.use_linked_clone = False {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.568504] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] vmware.vnc_keymap = en-us {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.568672] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] vmware.vnc_port = 5900 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.568838] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] vmware.vnc_port_total = 10000 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
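The [vmware] block above captures the effective VMwareVCDriver settings for this node (host vc1.osci.c.eu-de-1.cloud.sap, cluster testcl1, datastore_regex ^datastore.*). When only a log like this is available, the effective configuration can be recovered mechanically. A small, hypothetical helper (not part of Nova or oslo.config) that extracts the "group.option = value" pairs from these log_opt_values lines:

import re
from collections import defaultdict

# Matches the "... None None] group.option = value {{(pid=...) log_opt_values"
# shape of the surrounding entries; group and option names here are
# lower_snake_case, and values may be empty (e.g. ssl_ca_file = ).
OPT_RE = re.compile(
    r'\]\s(?P<group>[a-z0-9_]+)\.(?P<option>[a-z0-9_]+)\s=\s'
    r'(?P<value>.*?)\s?\{\{\(pid=\d+\)\slog_opt_values'
)

def parse_opt_values(log_text):
    """Return {group: {option: value}} recovered from a log excerpt."""
    opts = defaultdict(dict)
    for match in OPT_RE.finditer(log_text):
        opts[match.group('group')][match.group('option')] = match.group('value')
    return dict(opts)

# Example (against this very excerpt):
#   parse_opt_values(text)['vmware']['host_port'] -> '443'
#   parse_opt_values(text)['vmware']['cluster_name'] -> 'testcl1'

Note that everything comes back as strings, and masked secrets remain '****'; this is an auditing aid, not a substitute for the actual nova.conf.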
[ 575.569041] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] vnc.auth_schemes = ['none'] {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.569253] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] vnc.enabled = False {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.569563] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] vnc.novncproxy_base_url = http://127.0.0.1:6080/vnc_auto.html {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.569752] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] vnc.novncproxy_host = 0.0.0.0 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.569927] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] vnc.novncproxy_port = 6080 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.570122] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] vnc.server_listen = 127.0.0.1 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.570301] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] vnc.server_proxyclient_address = 127.0.0.1 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.570466] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] vnc.vencrypt_ca_certs = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.570626] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] vnc.vencrypt_client_cert = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.570784] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] vnc.vencrypt_client_key = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.570961] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] workarounds.disable_compute_service_check_for_ffu = False {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.571138] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] workarounds.disable_deep_image_inspection = False {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.571302] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] workarounds.disable_fallback_pcpu_query = False {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.571465] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] workarounds.disable_group_policy_check_upcall = False {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [
575.571623] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] workarounds.disable_libvirt_livesnapshot = False {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.571783] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] workarounds.disable_rootwrap = False {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.571944] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] workarounds.enable_numa_live_migration = False {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.572117] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] workarounds.enable_qemu_monitor_announce_self = False {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.572282] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.572443] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] workarounds.handle_virt_lifecycle_events = True {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.572601] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] workarounds.libvirt_disable_apic = False {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.572758] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] workarounds.never_download_image_if_on_rbd = False {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.572919] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] workarounds.qemu_monitor_announce_self_count = 3 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.573090] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] workarounds.qemu_monitor_announce_self_interval = 1 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.573252] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] workarounds.reserve_disk_resource_for_image_cache = False {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.573416] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] workarounds.skip_cpu_compare_at_startup = False {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.573576] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] workarounds.skip_cpu_compare_on_dest = False {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.573735] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None 
None] workarounds.skip_hypervisor_version_check_on_lm = False {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.573893] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] workarounds.skip_reserve_in_use_ironic_nodes = False {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.574064] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] workarounds.unified_limits_count_pcpu_as_vcpu = False {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.574233] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.574420] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] wsgi.api_paste_config = /etc/nova/api-paste.ini {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.574591] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] wsgi.client_socket_timeout = 900 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.574760] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] wsgi.default_pool_size = 1000 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.574927] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] wsgi.keep_alive = True {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.575106] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] wsgi.max_header_line = 16384 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.575271] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] wsgi.secure_proxy_ssl_header = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.575434] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] wsgi.ssl_ca_file = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.575594] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] wsgi.ssl_cert_file = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.575752] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] wsgi.ssl_key_file = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.575916] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] wsgi.tcp_keepidle = 600 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.576101] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] 
wsgi.wsgi_log_format = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.576273] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] zvm.ca_file = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.576438] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] zvm.cloud_connector_url = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.576725] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] zvm.image_tmp_path = /opt/stack/data/n-cpu-1/images {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.576897] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] zvm.reachable_timeout = 300 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.577093] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] oslo_policy.enforce_new_defaults = True {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.577267] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] oslo_policy.enforce_scope = True {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.577445] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] oslo_policy.policy_default_rule = default {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.577625] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] oslo_policy.policy_dirs = ['policy.d'] {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.577799] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] oslo_policy.policy_file = policy.yaml {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.577970] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] oslo_policy.remote_content_type = application/x-www-form-urlencoded {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.578145] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] oslo_policy.remote_ssl_ca_crt_file = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.578336] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] oslo_policy.remote_ssl_client_crt_file = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.578501] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] oslo_policy.remote_ssl_client_key_file = None {{(pid=68906) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.578672] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] oslo_policy.remote_ssl_verify_server_crt = False {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.578847] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] oslo_versionedobjects.fatal_exception_format_errors = False {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.579028] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.579229] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] profiler.connection_string = messaging:// {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.579413] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] profiler.enabled = False {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.579585] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] profiler.es_doc_type = notification {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.579749] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] profiler.es_scroll_size = 10000 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.579919] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] profiler.es_scroll_time = 2m {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.580094] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] profiler.filter_error_trace = False {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.580264] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] profiler.hmac_keys = **** {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.580431] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] profiler.sentinel_service_name = mymaster {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.580597] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] profiler.socket_timeout = 0.1 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.580758] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] profiler.trace_requests = False {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.580916] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] profiler.trace_sqlalchemy = False {{(pid=68906) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.581105] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] profiler_jaeger.process_tags = {} {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.581269] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] profiler_jaeger.service_name_prefix = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.581431] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] profiler_otlp.service_name_prefix = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.581597] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] remote_debug.host = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.581754] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] remote_debug.port = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.581930] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] oslo_messaging_rabbit.amqp_auto_delete = False {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.582104] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] oslo_messaging_rabbit.amqp_durable_queues = False {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.582270] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] oslo_messaging_rabbit.conn_pool_min_size = 2 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.582433] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] oslo_messaging_rabbit.conn_pool_ttl = 1200 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.582596] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] oslo_messaging_rabbit.direct_mandatory_flag = True {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.582756] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] oslo_messaging_rabbit.enable_cancel_on_failover = False {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.582915] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] oslo_messaging_rabbit.heartbeat_in_pthread = False {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.583087] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] oslo_messaging_rabbit.heartbeat_rate = 2 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.583253] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] 
oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.583411] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] oslo_messaging_rabbit.kombu_compression = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.583578] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] oslo_messaging_rabbit.kombu_failover_strategy = round-robin {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.583744] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.583909] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.584085] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] oslo_messaging_rabbit.rabbit_ha_queues = False {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.584252] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] oslo_messaging_rabbit.rabbit_interval_max = 30 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.584425] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.584586] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.584744] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.584908] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.585084] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.585245] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] oslo_messaging_rabbit.rabbit_quorum_queue = False {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.585409] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] oslo_messaging_rabbit.rabbit_retry_backoff = 2 
{{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.585566] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] oslo_messaging_rabbit.rabbit_retry_interval = 1 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.585724] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.585890] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] oslo_messaging_rabbit.rpc_conn_pool_size = 30 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.586068] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] oslo_messaging_rabbit.ssl = False {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.586245] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] oslo_messaging_rabbit.ssl_ca_file = {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.586417] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] oslo_messaging_rabbit.ssl_cert_file = {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.586579] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] oslo_messaging_rabbit.ssl_enforce_fips_mode = False {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.586747] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] oslo_messaging_rabbit.ssl_key_file = {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.586913] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] oslo_messaging_rabbit.ssl_version = {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.587116] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] oslo_messaging_notifications.driver = ['messagingv2'] {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.587281] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] oslo_messaging_notifications.retry = -1 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.587468] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] oslo_messaging_notifications.topics = ['notifications'] {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.587638] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] oslo_messaging_notifications.transport_url = **** {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
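The oslo_messaging_notifications values just logged (driver ['messagingv2'], topics ['notifications'], retry = -1, i.e. retry indefinitely; transport_url masked as ****) determine how this service emits notifications. A minimal sketch, assuming a configured CONF, of a notifier wired up with those same settings via oslo.messaging; the publisher id and event type below are hypothetical, not taken from this log:

import oslo_messaging as messaging
from oslo_config import cfg

CONF = cfg.CONF

# Reads the (masked) transport_url from the loaded configuration.
transport = messaging.get_notification_transport(CONF)
notifier = messaging.Notifier(
    transport,
    publisher_id='compute.devstack-host',  # hypothetical publisher id
    driver='messagingv2',                  # value logged above
    topics=['notifications'],              # value logged above
    retry=-1,                              # -1 = retry forever, as logged
)

# Emits an INFO-priority notification on the 'notifications' topic;
# event type and payload are placeholders for illustration.
notifier.info({}, 'compute.instance.example', {'detail': 'sketch only'})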
[ 575.587809] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] oslo_limit.auth_section = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.587974] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] oslo_limit.auth_type = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.588145] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] oslo_limit.cafile = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.588346] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] oslo_limit.certfile = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.588523] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] oslo_limit.collect_timing = False {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.588682] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] oslo_limit.connect_retries = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.588840] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] oslo_limit.connect_retry_delay = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.588998] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] oslo_limit.endpoint_id = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.589174] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] oslo_limit.endpoint_override = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.589352] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] oslo_limit.insecure = False {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.589513] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] oslo_limit.keyfile = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.589670] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] oslo_limit.max_version = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.589823] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] oslo_limit.min_version = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.589977] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] oslo_limit.region_name = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.590147] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] oslo_limit.service_name = None {{(pid=68906)
log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.590305] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] oslo_limit.service_type = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.590466] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] oslo_limit.split_loggers = False {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.590624] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] oslo_limit.status_code_retries = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.590781] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] oslo_limit.status_code_retry_delay = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.590936] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] oslo_limit.timeout = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.591103] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] oslo_limit.valid_interfaces = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.591262] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] oslo_limit.version = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.591428] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] oslo_reports.file_event_handler = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.591591] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] oslo_reports.file_event_handler_interval = 1 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.591747] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] oslo_reports.log_dir = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.591916] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] vif_plug_linux_bridge_privileged.capabilities = [12] {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.592090] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] vif_plug_linux_bridge_privileged.group = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.592253] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] vif_plug_linux_bridge_privileged.helper_command = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.592420] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] vif_plug_linux_bridge_privileged.logger_name = 
oslo_privsep.daemon {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.592584] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] vif_plug_linux_bridge_privileged.thread_pool_size = 8 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.592743] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] vif_plug_linux_bridge_privileged.user = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.592911] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] vif_plug_ovs_privileged.capabilities = [12, 1] {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.593080] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] vif_plug_ovs_privileged.group = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.593241] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] vif_plug_ovs_privileged.helper_command = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.593405] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.593565] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] vif_plug_ovs_privileged.thread_pool_size = 8 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.593720] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] vif_plug_ovs_privileged.user = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.593887] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] os_vif_linux_bridge.flat_interface = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.594076] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] os_vif_linux_bridge.forward_bridge_interface = ['all'] {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.594253] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] os_vif_linux_bridge.iptables_bottom_regex = {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.594424] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] os_vif_linux_bridge.iptables_drop_action = DROP {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.594593] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] os_vif_linux_bridge.iptables_top_regex = {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.594760] 
env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] os_vif_linux_bridge.network_device_mtu = 1500 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.594921] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] os_vif_linux_bridge.use_ipv6 = False {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.595092] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] os_vif_linux_bridge.vlan_interface = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.595271] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] os_vif_ovs.default_qos_type = linux-noop {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.595443] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] os_vif_ovs.isolate_vif = False {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.595609] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] os_vif_ovs.network_device_mtu = 1500 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.595773] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] os_vif_ovs.ovs_vsctl_timeout = 120 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.595940] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] os_vif_ovs.ovsdb_connection = tcp:127.0.0.1:6640 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.596118] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] os_vif_ovs.ovsdb_interface = native {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.596281] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] os_vif_ovs.per_port_bridge = False {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.596446] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] os_brick.lock_path = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.596609] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] os_brick.wait_mpath_device_attempts = 4 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.596769] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] os_brick.wait_mpath_device_interval = 1 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.596937] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] privsep_osbrick.capabilities = [21] {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.597108] 
env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] privsep_osbrick.group = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.597293] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] privsep_osbrick.helper_command = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.597427] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] privsep_osbrick.logger_name = os_brick.privileged {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.597592] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] privsep_osbrick.thread_pool_size = 8 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.597748] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] privsep_osbrick.user = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.597917] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] nova_sys_admin.capabilities = [0, 1, 2, 3, 12, 21] {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.598084] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] nova_sys_admin.group = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.598260] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] nova_sys_admin.helper_command = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.598432] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] nova_sys_admin.logger_name = oslo_privsep.daemon {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.598593] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] nova_sys_admin.thread_pool_size = 8 {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.598748] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] nova_sys_admin.user = None {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 575.598880] env[68906]: DEBUG oslo_service.service [None req-172c210e-016a-48e0-9610-170b14f3f7b9 None None] ******************************************************************************** {{(pid=68906) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2613}} [ 575.599338] env[68906]: INFO nova.service [-] Starting compute node (version 0.0.1) [ 575.610421] env[68906]: WARNING nova.virt.vmwareapi.driver [None req-0544d0bd-c80c-45cb-84ad-6ae010405171 None None] The vmwareapi driver is not tested by the OpenStack project nor does it have clear maintainer(s) and thus its quality can not be ensured. It should be considered experimental and may be removed in a future release. 
If you are using the driver in production please let us know via the openstack-discuss mailing list. [ 575.610894] env[68906]: INFO nova.virt.node [None req-0544d0bd-c80c-45cb-84ad-6ae010405171 None None] Generated node identity 1119f6db-bfd7-4ef3-bdff-5c6974dc249b [ 575.611149] env[68906]: INFO nova.virt.node [None req-0544d0bd-c80c-45cb-84ad-6ae010405171 None None] Wrote node identity 1119f6db-bfd7-4ef3-bdff-5c6974dc249b to /opt/stack/data/n-cpu-1/compute_id [ 575.624659] env[68906]: WARNING nova.compute.manager [None req-0544d0bd-c80c-45cb-84ad-6ae010405171 None None] Compute nodes ['1119f6db-bfd7-4ef3-bdff-5c6974dc249b'] for host cpu-1 were not found in the database. If this is the first time this service is starting on this host, then you can ignore this warning. [ 575.659117] env[68906]: INFO nova.compute.manager [None req-0544d0bd-c80c-45cb-84ad-6ae010405171 None None] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host [ 575.680334] env[68906]: WARNING nova.compute.manager [None req-0544d0bd-c80c-45cb-84ad-6ae010405171 None None] No compute node record found for host cpu-1. If this is the first time this service is starting on this host, then you can ignore this warning.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host cpu-1 could not be found. [ 575.680575] env[68906]: DEBUG oslo_concurrency.lockutils [None req-0544d0bd-c80c-45cb-84ad-6ae010405171 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 575.680802] env[68906]: DEBUG oslo_concurrency.lockutils [None req-0544d0bd-c80c-45cb-84ad-6ae010405171 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 575.680948] env[68906]: DEBUG oslo_concurrency.lockutils [None req-0544d0bd-c80c-45cb-84ad-6ae010405171 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 575.681114] env[68906]: DEBUG nova.compute.resource_tracker [None req-0544d0bd-c80c-45cb-84ad-6ae010405171 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=68906) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 575.682241] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dfb2b15c-eadb-43f1-85d2-4d57ddfd9382 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 575.691026] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-85e931cd-487e-41e1-bbd2-3b21dd75848a {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 575.704724] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c2e0590d-530f-4a2f-876f-9db66e8af707 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 575.711016] env[68906]: DEBUG oslo_vmware.service [-] 
Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c35848d1-bb95-4961-8432-54eea1ad3a75 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 575.741120] env[68906]: DEBUG nova.compute.resource_tracker [None req-0544d0bd-c80c-45cb-84ad-6ae010405171 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180970MB free_disk=93GB free_vcpus=48 pci_devices=None {{(pid=68906) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 575.741264] env[68906]: DEBUG oslo_concurrency.lockutils [None req-0544d0bd-c80c-45cb-84ad-6ae010405171 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 575.741420] env[68906]: DEBUG oslo_concurrency.lockutils [None req-0544d0bd-c80c-45cb-84ad-6ae010405171 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 575.752699] env[68906]: WARNING nova.compute.resource_tracker [None req-0544d0bd-c80c-45cb-84ad-6ae010405171 None None] No compute node record for cpu-1:1119f6db-bfd7-4ef3-bdff-5c6974dc249b: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host 1119f6db-bfd7-4ef3-bdff-5c6974dc249b could not be found. [ 575.765947] env[68906]: INFO nova.compute.resource_tracker [None req-0544d0bd-c80c-45cb-84ad-6ae010405171 None None] Compute node record created for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 with uuid: 1119f6db-bfd7-4ef3-bdff-5c6974dc249b [ 575.817372] env[68906]: DEBUG nova.compute.resource_tracker [None req-0544d0bd-c80c-45cb-84ad-6ae010405171 None None] Total usable vcpus: 48, total allocated vcpus: 0 {{(pid=68906) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 575.817543] env[68906]: DEBUG nova.compute.resource_tracker [None req-0544d0bd-c80c-45cb-84ad-6ae010405171 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=512MB phys_disk=200GB used_disk=0GB total_vcpus=48 used_vcpus=0 pci_stats=[] {{(pid=68906) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 575.920293] env[68906]: INFO nova.scheduler.client.report [None req-0544d0bd-c80c-45cb-84ad-6ae010405171 None None] [req-63c31f7d-35c2-41d6-bdb5-df9443ce928c] Created resource provider record via placement API for resource provider with UUID 1119f6db-bfd7-4ef3-bdff-5c6974dc249b and name domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28. 
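
The inventory records below follow Placement's standard capacity model: for each resource class, the schedulable amount is (total - reserved) * allocation_ratio, consumed in multiples of step_size with at most max_unit granted to any single allocation. A minimal sketch of that arithmetic (illustration only, not Nova source), plugging in the VCPU and MEMORY_MB inventory this node reports:

# Illustration of the capacity math Placement applies to the inventory
# records logged below; the dicts are copied from this log.
def schedulable(inv):
    # Capacity the scheduler may hand out for one resource class.
    return int((inv['total'] - inv['reserved']) * inv['allocation_ratio'])

vcpu = {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16,
        'step_size': 1, 'allocation_ratio': 4.0}
memory_mb = {'total': 196590, 'reserved': 512, 'min_unit': 1,
             'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}

print(schedulable(vcpu))       # 192: 48 physical vCPUs oversubscribed 4x
print(schedulable(memory_mb))  # 196078 MB: no RAM oversubscription

So the 48 physical vCPUs are offered to the scheduler as 192, while max_unit still caps any single instance at 16 VCPUs and 65530 MB regardless of the aggregate headroom.
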
[ 575.936239] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ebec4ba4-f4bb-4f40-ba7a-4acdb7747799 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 575.943892] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-81bd79bb-be02-4b34-94e5-0941472e2f71 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 575.973139] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d7620fde-ea6b-40ec-8f56-463df1f3c6b3 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 575.980609] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f614abfc-b783-4263-97f5-1af18305fa98 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 575.993521] env[68906]: DEBUG nova.compute.provider_tree [None req-0544d0bd-c80c-45cb-84ad-6ae010405171 None None] Updating inventory in ProviderTree for provider 1119f6db-bfd7-4ef3-bdff-5c6974dc249b with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 93, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68906) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 576.029584] env[68906]: DEBUG nova.scheduler.client.report [None req-0544d0bd-c80c-45cb-84ad-6ae010405171 None None] Updated inventory for provider 1119f6db-bfd7-4ef3-bdff-5c6974dc249b with generation 0 in Placement from set_inventory_for_provider using data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 93, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68906) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:957}} [ 576.029840] env[68906]: DEBUG nova.compute.provider_tree [None req-0544d0bd-c80c-45cb-84ad-6ae010405171 None None] Updating resource provider 1119f6db-bfd7-4ef3-bdff-5c6974dc249b generation from 0 to 1 during operation: update_inventory {{(pid=68906) _update_generation /opt/stack/nova/nova/compute/provider_tree.py:164}} [ 576.030036] env[68906]: DEBUG nova.compute.provider_tree [None req-0544d0bd-c80c-45cb-84ad-6ae010405171 None None] Updating inventory in ProviderTree for provider 1119f6db-bfd7-4ef3-bdff-5c6974dc249b with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 93, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68906) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 576.077008] env[68906]: DEBUG nova.compute.provider_tree [None req-0544d0bd-c80c-45cb-84ad-6ae010405171 None None] Updating resource 
provider 1119f6db-bfd7-4ef3-bdff-5c6974dc249b generation from 1 to 2 during operation: update_traits {{(pid=68906) _update_generation /opt/stack/nova/nova/compute/provider_tree.py:164}} [ 576.094116] env[68906]: DEBUG nova.compute.resource_tracker [None req-0544d0bd-c80c-45cb-84ad-6ae010405171 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=68906) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 576.094300] env[68906]: DEBUG oslo_concurrency.lockutils [None req-0544d0bd-c80c-45cb-84ad-6ae010405171 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.353s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 576.094456] env[68906]: DEBUG nova.service [None req-0544d0bd-c80c-45cb-84ad-6ae010405171 None None] Creating RPC server for service compute {{(pid=68906) start /opt/stack/nova/nova/service.py:182}} [ 576.108043] env[68906]: DEBUG nova.service [None req-0544d0bd-c80c-45cb-84ad-6ae010405171 None None] Join ServiceGroup membership for this service compute {{(pid=68906) start /opt/stack/nova/nova/service.py:199}} [ 576.108270] env[68906]: DEBUG nova.servicegroup.drivers.db [None req-0544d0bd-c80c-45cb-84ad-6ae010405171 None None] DB_Driver: join new ServiceGroup member cpu-1 to the compute group, service = {{(pid=68906) join /opt/stack/nova/nova/servicegroup/drivers/db.py:44}} [ 585.438423] env[68906]: DEBUG dbcounter [-] [68906] Writing DB stats nova_cell0:SELECT=1 {{(pid=68906) stat_writer /opt/stack/data/venv/lib/python3.10/site-packages/dbcounter.py:115}} [ 585.439443] env[68906]: DEBUG dbcounter [-] [68906] Writing DB stats nova_cell1:SELECT=1 {{(pid=68906) stat_writer /opt/stack/data/venv/lib/python3.10/site-packages/dbcounter.py:115}} [ 610.109725] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._sync_power_states {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 610.124382] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Getting list of instances from cluster (obj){ [ 610.124382] env[68906]: value = "domain-c8" [ 610.124382] env[68906]: _type = "ClusterComputeResource" [ 610.124382] env[68906]: } {{(pid=68906) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2122}} [ 610.126708] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9c87d91f-33d4-4f37-832a-d98bbc1e01b4 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 610.137103] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Got total of 0 instances {{(pid=68906) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2131}} [ 610.137103] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._cleanup_running_deleted_instances {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 610.137103] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Getting list of instances from cluster (obj){ [ 610.137103] 
env[68906]: value = "domain-c8" [ 610.137103] env[68906]: _type = "ClusterComputeResource" [ 610.137103] env[68906]: } {{(pid=68906) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2122}} [ 610.138819] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0ba53197-7a39-44dd-a50f-dca80f559d96 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 610.148678] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Got total of 0 instances {{(pid=68906) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2131}} [ 618.592653] env[68906]: DEBUG oslo_concurrency.lockutils [None req-1fdaad8b-4336-4044-9b9d-0b74397210fc tempest-DeleteServersAdminTestJSON-1668331877 tempest-DeleteServersAdminTestJSON-1668331877-project-member] Acquiring lock "57feb127-36f1-403c-bbca-7054286c1972" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 618.592653] env[68906]: DEBUG oslo_concurrency.lockutils [None req-1fdaad8b-4336-4044-9b9d-0b74397210fc tempest-DeleteServersAdminTestJSON-1668331877 tempest-DeleteServersAdminTestJSON-1668331877-project-member] Lock "57feb127-36f1-403c-bbca-7054286c1972" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 618.622997] env[68906]: DEBUG nova.compute.manager [None req-1fdaad8b-4336-4044-9b9d-0b74397210fc tempest-DeleteServersAdminTestJSON-1668331877 tempest-DeleteServersAdminTestJSON-1668331877-project-member] [instance: 57feb127-36f1-403c-bbca-7054286c1972] Starting instance... 
{{(pid=68906) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 618.776748] env[68906]: DEBUG oslo_concurrency.lockutils [None req-1fdaad8b-4336-4044-9b9d-0b74397210fc tempest-DeleteServersAdminTestJSON-1668331877 tempest-DeleteServersAdminTestJSON-1668331877-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 618.777046] env[68906]: DEBUG oslo_concurrency.lockutils [None req-1fdaad8b-4336-4044-9b9d-0b74397210fc tempest-DeleteServersAdminTestJSON-1668331877 tempest-DeleteServersAdminTestJSON-1668331877-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 618.778688] env[68906]: INFO nova.compute.claims [None req-1fdaad8b-4336-4044-9b9d-0b74397210fc tempest-DeleteServersAdminTestJSON-1668331877 tempest-DeleteServersAdminTestJSON-1668331877-project-member] [instance: 57feb127-36f1-403c-bbca-7054286c1972] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 618.932937] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f7b99489-7331-4211-8f2b-9746eb707aed {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 618.946602] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d9d50e98-33b8-4bf7-bbaa-2ccdbe0f2673 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 618.993156] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-81096800-141c-4eb9-a858-bb65dc209acc {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 619.007912] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-addda8fe-1fc0-43dc-91ee-07e66a18be9b {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 619.023211] env[68906]: DEBUG nova.compute.provider_tree [None req-1fdaad8b-4336-4044-9b9d-0b74397210fc tempest-DeleteServersAdminTestJSON-1668331877 tempest-DeleteServersAdminTestJSON-1668331877-project-member] Inventory has not changed in ProviderTree for provider: 1119f6db-bfd7-4ef3-bdff-5c6974dc249b {{(pid=68906) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 619.043451] env[68906]: DEBUG nova.scheduler.client.report [None req-1fdaad8b-4336-4044-9b9d-0b74397210fc tempest-DeleteServersAdminTestJSON-1668331877 tempest-DeleteServersAdminTestJSON-1668331877-project-member] Inventory has not changed for provider 1119f6db-bfd7-4ef3-bdff-5c6974dc249b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 93, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68906) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 619.076815] env[68906]: DEBUG 
oslo_concurrency.lockutils [None req-1fdaad8b-4336-4044-9b9d-0b74397210fc tempest-DeleteServersAdminTestJSON-1668331877 tempest-DeleteServersAdminTestJSON-1668331877-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.300s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 619.077750] env[68906]: DEBUG nova.compute.manager [None req-1fdaad8b-4336-4044-9b9d-0b74397210fc tempest-DeleteServersAdminTestJSON-1668331877 tempest-DeleteServersAdminTestJSON-1668331877-project-member] [instance: 57feb127-36f1-403c-bbca-7054286c1972] Start building networks asynchronously for instance. {{(pid=68906) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 619.148097] env[68906]: DEBUG nova.compute.utils [None req-1fdaad8b-4336-4044-9b9d-0b74397210fc tempest-DeleteServersAdminTestJSON-1668331877 tempest-DeleteServersAdminTestJSON-1668331877-project-member] Using /dev/sd instead of None {{(pid=68906) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 619.149699] env[68906]: DEBUG nova.compute.manager [None req-1fdaad8b-4336-4044-9b9d-0b74397210fc tempest-DeleteServersAdminTestJSON-1668331877 tempest-DeleteServersAdminTestJSON-1668331877-project-member] [instance: 57feb127-36f1-403c-bbca-7054286c1972] Allocating IP information in the background. {{(pid=68906) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 619.149924] env[68906]: DEBUG nova.network.neutron [None req-1fdaad8b-4336-4044-9b9d-0b74397210fc tempest-DeleteServersAdminTestJSON-1668331877 tempest-DeleteServersAdminTestJSON-1668331877-project-member] [instance: 57feb127-36f1-403c-bbca-7054286c1972] allocate_for_instance() {{(pid=68906) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 619.176050] env[68906]: DEBUG nova.compute.manager [None req-1fdaad8b-4336-4044-9b9d-0b74397210fc tempest-DeleteServersAdminTestJSON-1668331877 tempest-DeleteServersAdminTestJSON-1668331877-project-member] [instance: 57feb127-36f1-403c-bbca-7054286c1972] Start building block device mappings for instance. {{(pid=68906) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 619.301435] env[68906]: DEBUG nova.compute.manager [None req-1fdaad8b-4336-4044-9b9d-0b74397210fc tempest-DeleteServersAdminTestJSON-1668331877 tempest-DeleteServersAdminTestJSON-1668331877-project-member] [instance: 57feb127-36f1-403c-bbca-7054286c1972] Start spawning the instance on the hypervisor. 
{{(pid=68906) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 619.426134] env[68906]: DEBUG nova.virt.hardware [None req-1fdaad8b-4336-4044-9b9d-0b74397210fc tempest-DeleteServersAdminTestJSON-1668331877 tempest-DeleteServersAdminTestJSON-1668331877-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-17T13:00:39Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-17T13:00:23Z,direct_url=,disk_format='vmdk',id=b1400c31-d33b-4e13-944f-4c645e62493e,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='1ae7bf3a375d41c6af5e7536af51ffd1',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-17T13:00:24Z,virtual_size=,visibility=), allow threads: False {{(pid=68906) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 619.426403] env[68906]: DEBUG nova.virt.hardware [None req-1fdaad8b-4336-4044-9b9d-0b74397210fc tempest-DeleteServersAdminTestJSON-1668331877 tempest-DeleteServersAdminTestJSON-1668331877-project-member] Flavor limits 0:0:0 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 619.426558] env[68906]: DEBUG nova.virt.hardware [None req-1fdaad8b-4336-4044-9b9d-0b74397210fc tempest-DeleteServersAdminTestJSON-1668331877 tempest-DeleteServersAdminTestJSON-1668331877-project-member] Image limits 0:0:0 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 619.426730] env[68906]: DEBUG nova.virt.hardware [None req-1fdaad8b-4336-4044-9b9d-0b74397210fc tempest-DeleteServersAdminTestJSON-1668331877 tempest-DeleteServersAdminTestJSON-1668331877-project-member] Flavor pref 0:0:0 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 619.426873] env[68906]: DEBUG nova.virt.hardware [None req-1fdaad8b-4336-4044-9b9d-0b74397210fc tempest-DeleteServersAdminTestJSON-1668331877 tempest-DeleteServersAdminTestJSON-1668331877-project-member] Image pref 0:0:0 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 619.427040] env[68906]: DEBUG nova.virt.hardware [None req-1fdaad8b-4336-4044-9b9d-0b74397210fc tempest-DeleteServersAdminTestJSON-1668331877 tempest-DeleteServersAdminTestJSON-1668331877-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 619.427259] env[68906]: DEBUG nova.virt.hardware [None req-1fdaad8b-4336-4044-9b9d-0b74397210fc tempest-DeleteServersAdminTestJSON-1668331877 tempest-DeleteServersAdminTestJSON-1668331877-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68906) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 619.427413] env[68906]: DEBUG nova.virt.hardware [None req-1fdaad8b-4336-4044-9b9d-0b74397210fc tempest-DeleteServersAdminTestJSON-1668331877 tempest-DeleteServersAdminTestJSON-1668331877-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68906) _get_possible_cpu_topologies 
/opt/stack/nova/nova/virt/hardware.py:471}} [ 619.427761] env[68906]: DEBUG nova.virt.hardware [None req-1fdaad8b-4336-4044-9b9d-0b74397210fc tempest-DeleteServersAdminTestJSON-1668331877 tempest-DeleteServersAdminTestJSON-1668331877-project-member] Got 1 possible topologies {{(pid=68906) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 619.427926] env[68906]: DEBUG nova.virt.hardware [None req-1fdaad8b-4336-4044-9b9d-0b74397210fc tempest-DeleteServersAdminTestJSON-1668331877 tempest-DeleteServersAdminTestJSON-1668331877-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68906) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 619.428144] env[68906]: DEBUG nova.virt.hardware [None req-1fdaad8b-4336-4044-9b9d-0b74397210fc tempest-DeleteServersAdminTestJSON-1668331877 tempest-DeleteServersAdminTestJSON-1668331877-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68906) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 619.429099] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-082801bd-eb1c-46c9-9e61-6c0b44112363 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 619.438108] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-39a0ae39-50a9-4cfa-8853-a69c074646bf {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 619.455753] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-28b3bb49-fd17-4b9b-95df-9d139f696b64 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 620.003413] env[68906]: DEBUG nova.policy [None req-1fdaad8b-4336-4044-9b9d-0b74397210fc tempest-DeleteServersAdminTestJSON-1668331877 tempest-DeleteServersAdminTestJSON-1668331877-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '9fb52d396bef4197bdd563b6c102260a', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'deb4a90ce69f466d9e9f1611d4797b62', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68906) authorize /opt/stack/nova/nova/policy.py:203}} [ 620.857740] env[68906]: DEBUG oslo_concurrency.lockutils [None req-b7790bd5-6745-473c-89fa-de780d438374 tempest-ServerExternalEventsTest-1430980438 tempest-ServerExternalEventsTest-1430980438-project-member] Acquiring lock "e2ee8d01-b1d3-4bde-81ae-668ffeef42b0" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 620.857740] env[68906]: DEBUG oslo_concurrency.lockutils [None req-b7790bd5-6745-473c-89fa-de780d438374 tempest-ServerExternalEventsTest-1430980438 tempest-ServerExternalEventsTest-1430980438-project-member] Lock "e2ee8d01-b1d3-4bde-81ae-668ffeef42b0" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68906) inner
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 620.889771] env[68906]: DEBUG nova.compute.manager [None req-b7790bd5-6745-473c-89fa-de780d438374 tempest-ServerExternalEventsTest-1430980438 tempest-ServerExternalEventsTest-1430980438-project-member] [instance: e2ee8d01-b1d3-4bde-81ae-668ffeef42b0] Starting instance... {{(pid=68906) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 621.021810] env[68906]: DEBUG oslo_concurrency.lockutils [None req-b7790bd5-6745-473c-89fa-de780d438374 tempest-ServerExternalEventsTest-1430980438 tempest-ServerExternalEventsTest-1430980438-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 621.022097] env[68906]: DEBUG oslo_concurrency.lockutils [None req-b7790bd5-6745-473c-89fa-de780d438374 tempest-ServerExternalEventsTest-1430980438 tempest-ServerExternalEventsTest-1430980438-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 621.023990] env[68906]: INFO nova.compute.claims [None req-b7790bd5-6745-473c-89fa-de780d438374 tempest-ServerExternalEventsTest-1430980438 tempest-ServerExternalEventsTest-1430980438-project-member] [instance: e2ee8d01-b1d3-4bde-81ae-668ffeef42b0] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 621.225898] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ca3593f2-fbef-4aa6-8cbc-52cf8edb7ecf {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 621.241955] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-798321d8-fbc9-4168-ba3c-698d9243b050 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 621.285481] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6d73e743-324e-42aa-ab4c-e67e21e58beb {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 621.294260] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9ab57acf-9bc5-46e4-9709-25a8053951b5 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 621.310265] env[68906]: DEBUG nova.compute.provider_tree [None req-b7790bd5-6745-473c-89fa-de780d438374 tempest-ServerExternalEventsTest-1430980438 tempest-ServerExternalEventsTest-1430980438-project-member] Inventory has not changed in ProviderTree for provider: 1119f6db-bfd7-4ef3-bdff-5c6974dc249b {{(pid=68906) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 621.321200] env[68906]: DEBUG nova.scheduler.client.report [None req-b7790bd5-6745-473c-89fa-de780d438374 tempest-ServerExternalEventsTest-1430980438 tempest-ServerExternalEventsTest-1430980438-project-member] Inventory has not changed for provider 1119f6db-bfd7-4ef3-bdff-5c6974dc249b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 
'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 93, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68906) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 621.345750] env[68906]: DEBUG oslo_concurrency.lockutils [None req-b7790bd5-6745-473c-89fa-de780d438374 tempest-ServerExternalEventsTest-1430980438 tempest-ServerExternalEventsTest-1430980438-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.324s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 621.346373] env[68906]: DEBUG nova.compute.manager [None req-b7790bd5-6745-473c-89fa-de780d438374 tempest-ServerExternalEventsTest-1430980438 tempest-ServerExternalEventsTest-1430980438-project-member] [instance: e2ee8d01-b1d3-4bde-81ae-668ffeef42b0] Start building networks asynchronously for instance. {{(pid=68906) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 621.418631] env[68906]: DEBUG nova.compute.utils [None req-b7790bd5-6745-473c-89fa-de780d438374 tempest-ServerExternalEventsTest-1430980438 tempest-ServerExternalEventsTest-1430980438-project-member] Using /dev/sd instead of None {{(pid=68906) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 621.419333] env[68906]: DEBUG nova.compute.manager [None req-b7790bd5-6745-473c-89fa-de780d438374 tempest-ServerExternalEventsTest-1430980438 tempest-ServerExternalEventsTest-1430980438-project-member] [instance: e2ee8d01-b1d3-4bde-81ae-668ffeef42b0] Allocating IP information in the background. {{(pid=68906) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 621.419708] env[68906]: DEBUG nova.network.neutron [None req-b7790bd5-6745-473c-89fa-de780d438374 tempest-ServerExternalEventsTest-1430980438 tempest-ServerExternalEventsTest-1430980438-project-member] [instance: e2ee8d01-b1d3-4bde-81ae-668ffeef42b0] allocate_for_instance() {{(pid=68906) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 621.446798] env[68906]: DEBUG nova.compute.manager [None req-b7790bd5-6745-473c-89fa-de780d438374 tempest-ServerExternalEventsTest-1430980438 tempest-ServerExternalEventsTest-1430980438-project-member] [instance: e2ee8d01-b1d3-4bde-81ae-668ffeef42b0] Start building block device mappings for instance. {{(pid=68906) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 621.547254] env[68906]: DEBUG nova.compute.manager [None req-b7790bd5-6745-473c-89fa-de780d438374 tempest-ServerExternalEventsTest-1430980438 tempest-ServerExternalEventsTest-1430980438-project-member] [instance: e2ee8d01-b1d3-4bde-81ae-668ffeef42b0] Start spawning the instance on the hypervisor. 
{{(pid=68906) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 621.586246] env[68906]: DEBUG nova.virt.hardware [None req-b7790bd5-6745-473c-89fa-de780d438374 tempest-ServerExternalEventsTest-1430980438 tempest-ServerExternalEventsTest-1430980438-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-17T13:00:39Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-17T13:00:23Z,direct_url=,disk_format='vmdk',id=b1400c31-d33b-4e13-944f-4c645e62493e,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='1ae7bf3a375d41c6af5e7536af51ffd1',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-17T13:00:24Z,virtual_size=,visibility=), allow threads: False {{(pid=68906) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 621.586246] env[68906]: DEBUG nova.virt.hardware [None req-b7790bd5-6745-473c-89fa-de780d438374 tempest-ServerExternalEventsTest-1430980438 tempest-ServerExternalEventsTest-1430980438-project-member] Flavor limits 0:0:0 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 621.586246] env[68906]: DEBUG nova.virt.hardware [None req-b7790bd5-6745-473c-89fa-de780d438374 tempest-ServerExternalEventsTest-1430980438 tempest-ServerExternalEventsTest-1430980438-project-member] Image limits 0:0:0 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 621.586948] env[68906]: DEBUG nova.virt.hardware [None req-b7790bd5-6745-473c-89fa-de780d438374 tempest-ServerExternalEventsTest-1430980438 tempest-ServerExternalEventsTest-1430980438-project-member] Flavor pref 0:0:0 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 621.587127] env[68906]: DEBUG nova.virt.hardware [None req-b7790bd5-6745-473c-89fa-de780d438374 tempest-ServerExternalEventsTest-1430980438 tempest-ServerExternalEventsTest-1430980438-project-member] Image pref 0:0:0 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 621.587296] env[68906]: DEBUG nova.virt.hardware [None req-b7790bd5-6745-473c-89fa-de780d438374 tempest-ServerExternalEventsTest-1430980438 tempest-ServerExternalEventsTest-1430980438-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 621.587450] env[68906]: DEBUG nova.virt.hardware [None req-b7790bd5-6745-473c-89fa-de780d438374 tempest-ServerExternalEventsTest-1430980438 tempest-ServerExternalEventsTest-1430980438-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68906) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 621.587628] env[68906]: DEBUG nova.virt.hardware [None req-b7790bd5-6745-473c-89fa-de780d438374 tempest-ServerExternalEventsTest-1430980438 tempest-ServerExternalEventsTest-1430980438-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68906) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 621.587822] 
env[68906]: DEBUG nova.virt.hardware [None req-b7790bd5-6745-473c-89fa-de780d438374 tempest-ServerExternalEventsTest-1430980438 tempest-ServerExternalEventsTest-1430980438-project-member] Got 1 possible topologies {{(pid=68906) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 621.588414] env[68906]: DEBUG nova.virt.hardware [None req-b7790bd5-6745-473c-89fa-de780d438374 tempest-ServerExternalEventsTest-1430980438 tempest-ServerExternalEventsTest-1430980438-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68906) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 621.588414] env[68906]: DEBUG nova.virt.hardware [None req-b7790bd5-6745-473c-89fa-de780d438374 tempest-ServerExternalEventsTest-1430980438 tempest-ServerExternalEventsTest-1430980438-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68906) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 621.589395] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0852a7bd-a645-4d51-bf1c-d8286e8132a4 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 621.599570] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-efa473cf-7be0-4068-bf04-bc20df3939b6 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 622.047555] env[68906]: DEBUG nova.policy [None req-b7790bd5-6745-473c-89fa-de780d438374 tempest-ServerExternalEventsTest-1430980438 tempest-ServerExternalEventsTest-1430980438-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '5f5e4c1ff56142759c151e74ccc25745', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'c3383efc94034ed9a8b2f7c73934e3c3', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68906) authorize /opt/stack/nova/nova/policy.py:203}} [ 623.666347] env[68906]: DEBUG nova.network.neutron [None req-1fdaad8b-4336-4044-9b9d-0b74397210fc tempest-DeleteServersAdminTestJSON-1668331877 tempest-DeleteServersAdminTestJSON-1668331877-project-member] [instance: 57feb127-36f1-403c-bbca-7054286c1972] Successfully created port: 6c63bf97-aa05-4edb-95ea-4046e0527259 {{(pid=68906) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 626.024213] env[68906]: DEBUG nova.network.neutron [None req-b7790bd5-6745-473c-89fa-de780d438374 tempest-ServerExternalEventsTest-1430980438 tempest-ServerExternalEventsTest-1430980438-project-member] [instance: e2ee8d01-b1d3-4bde-81ae-668ffeef42b0] Successfully created port: 9bdacd11-d4fc-4e1f-aac2-e4d8d097bac8 {{(pid=68906) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 627.470063] env[68906]: DEBUG nova.network.neutron [None req-1fdaad8b-4336-4044-9b9d-0b74397210fc tempest-DeleteServersAdminTestJSON-1668331877 tempest-DeleteServersAdminTestJSON-1668331877-project-member] [instance: 57feb127-36f1-403c-bbca-7054286c1972] Successfully updated port: 6c63bf97-aa05-4edb-95ea-4046e0527259 {{(pid=68906) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 627.491868] env[68906]: DEBUG 
oslo_concurrency.lockutils [None req-1fdaad8b-4336-4044-9b9d-0b74397210fc tempest-DeleteServersAdminTestJSON-1668331877 tempest-DeleteServersAdminTestJSON-1668331877-project-member] Acquiring lock "refresh_cache-57feb127-36f1-403c-bbca-7054286c1972" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 627.492035] env[68906]: DEBUG oslo_concurrency.lockutils [None req-1fdaad8b-4336-4044-9b9d-0b74397210fc tempest-DeleteServersAdminTestJSON-1668331877 tempest-DeleteServersAdminTestJSON-1668331877-project-member] Acquired lock "refresh_cache-57feb127-36f1-403c-bbca-7054286c1972" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 627.492862] env[68906]: DEBUG nova.network.neutron [None req-1fdaad8b-4336-4044-9b9d-0b74397210fc tempest-DeleteServersAdminTestJSON-1668331877 tempest-DeleteServersAdminTestJSON-1668331877-project-member] [instance: 57feb127-36f1-403c-bbca-7054286c1972] Building network info cache for instance {{(pid=68906) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 627.595110] env[68906]: DEBUG nova.network.neutron [None req-1fdaad8b-4336-4044-9b9d-0b74397210fc tempest-DeleteServersAdminTestJSON-1668331877 tempest-DeleteServersAdminTestJSON-1668331877-project-member] [instance: 57feb127-36f1-403c-bbca-7054286c1972] Instance cache missing network info. {{(pid=68906) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 629.023601] env[68906]: DEBUG nova.network.neutron [None req-1fdaad8b-4336-4044-9b9d-0b74397210fc tempest-DeleteServersAdminTestJSON-1668331877 tempest-DeleteServersAdminTestJSON-1668331877-project-member] [instance: 57feb127-36f1-403c-bbca-7054286c1972] Updating instance_info_cache with network_info: [{"id": "6c63bf97-aa05-4edb-95ea-4046e0527259", "address": "fa:16:3e:fa:b4:2c", "network": {"id": "63efabfb-0028-4758-9626-5f9860440121", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.104", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "1ae7bf3a375d41c6af5e7536af51ffd1", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "69054a13-b7ef-44e1-bd3b-3ca5ba602848", "external-id": "nsx-vlan-transportzone-153", "segmentation_id": 153, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap6c63bf97-aa", "ovs_interfaceid": "6c63bf97-aa05-4edb-95ea-4046e0527259", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68906) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 629.047082] env[68906]: DEBUG oslo_concurrency.lockutils [None req-1fdaad8b-4336-4044-9b9d-0b74397210fc tempest-DeleteServersAdminTestJSON-1668331877 tempest-DeleteServersAdminTestJSON-1668331877-project-member] Releasing lock "refresh_cache-57feb127-36f1-403c-bbca-7054286c1972" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 629.047458] env[68906]: DEBUG nova.compute.manager [None req-1fdaad8b-4336-4044-9b9d-0b74397210fc 
tempest-DeleteServersAdminTestJSON-1668331877 tempest-DeleteServersAdminTestJSON-1668331877-project-member] [instance: 57feb127-36f1-403c-bbca-7054286c1972] Instance network_info: |[{"id": "6c63bf97-aa05-4edb-95ea-4046e0527259", "address": "fa:16:3e:fa:b4:2c", "network": {"id": "63efabfb-0028-4758-9626-5f9860440121", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.104", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "1ae7bf3a375d41c6af5e7536af51ffd1", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "69054a13-b7ef-44e1-bd3b-3ca5ba602848", "external-id": "nsx-vlan-transportzone-153", "segmentation_id": 153, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap6c63bf97-aa", "ovs_interfaceid": "6c63bf97-aa05-4edb-95ea-4046e0527259", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=68906) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 629.047954] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-1fdaad8b-4336-4044-9b9d-0b74397210fc tempest-DeleteServersAdminTestJSON-1668331877 tempest-DeleteServersAdminTestJSON-1668331877-project-member] [instance: 57feb127-36f1-403c-bbca-7054286c1972] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:fa:b4:2c', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '69054a13-b7ef-44e1-bd3b-3ca5ba602848', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '6c63bf97-aa05-4edb-95ea-4046e0527259', 'vif_model': 'vmxnet3'}] {{(pid=68906) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 629.075736] env[68906]: DEBUG nova.virt.vmwareapi.vm_util [None req-1fdaad8b-4336-4044-9b9d-0b74397210fc tempest-DeleteServersAdminTestJSON-1668331877 tempest-DeleteServersAdminTestJSON-1668331877-project-member] Creating folder: OpenStack. Parent ref: group-v4. {{(pid=68906) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 629.076856] env[68906]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-ecb6f5af-355c-48a3-91ef-0ccf18a21134 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 629.092784] env[68906]: INFO nova.virt.vmwareapi.vm_util [None req-1fdaad8b-4336-4044-9b9d-0b74397210fc tempest-DeleteServersAdminTestJSON-1668331877 tempest-DeleteServersAdminTestJSON-1668331877-project-member] Created folder: OpenStack in parent group-v4. [ 629.094543] env[68906]: DEBUG nova.virt.vmwareapi.vm_util [None req-1fdaad8b-4336-4044-9b9d-0b74397210fc tempest-DeleteServersAdminTestJSON-1668331877 tempest-DeleteServersAdminTestJSON-1668331877-project-member] Creating folder: Project (deb4a90ce69f466d9e9f1611d4797b62). Parent ref: group-v694750. 
{{(pid=68906) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 629.094543] env[68906]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-7181433a-3d66-4dc1-a59c-07eacfcd5ce3 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 629.102506] env[68906]: INFO nova.virt.vmwareapi.vm_util [None req-1fdaad8b-4336-4044-9b9d-0b74397210fc tempest-DeleteServersAdminTestJSON-1668331877 tempest-DeleteServersAdminTestJSON-1668331877-project-member] Created folder: Project (deb4a90ce69f466d9e9f1611d4797b62) in parent group-v694750. [ 629.102733] env[68906]: DEBUG nova.virt.vmwareapi.vm_util [None req-1fdaad8b-4336-4044-9b9d-0b74397210fc tempest-DeleteServersAdminTestJSON-1668331877 tempest-DeleteServersAdminTestJSON-1668331877-project-member] Creating folder: Instances. Parent ref: group-v694751. {{(pid=68906) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 629.103242] env[68906]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-b78c164e-966d-4e28-b871-1e78637d89c4 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 629.112371] env[68906]: INFO nova.virt.vmwareapi.vm_util [None req-1fdaad8b-4336-4044-9b9d-0b74397210fc tempest-DeleteServersAdminTestJSON-1668331877 tempest-DeleteServersAdminTestJSON-1668331877-project-member] Created folder: Instances in parent group-v694751. [ 629.112695] env[68906]: DEBUG oslo.service.loopingcall [None req-1fdaad8b-4336-4044-9b9d-0b74397210fc tempest-DeleteServersAdminTestJSON-1668331877 tempest-DeleteServersAdminTestJSON-1668331877-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=68906) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 629.112946] env[68906]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 57feb127-36f1-403c-bbca-7054286c1972] Creating VM on the ESX host {{(pid=68906) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 629.113193] env[68906]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-bd7cf5cf-5d32-4a11-86d0-ceee4cfb2c47 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 629.137138] env[68906]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 629.137138] env[68906]: value = "task-3475254" [ 629.137138] env[68906]: _type = "Task" [ 629.137138] env[68906]: } to complete. {{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 629.149032] env[68906]: DEBUG oslo_vmware.api [-] Task: {'id': task-3475254, 'name': CreateVM_Task} progress is 0%. 
{{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 629.624254] env[68906]: DEBUG nova.network.neutron [None req-b7790bd5-6745-473c-89fa-de780d438374 tempest-ServerExternalEventsTest-1430980438 tempest-ServerExternalEventsTest-1430980438-project-member] [instance: e2ee8d01-b1d3-4bde-81ae-668ffeef42b0] Successfully updated port: 9bdacd11-d4fc-4e1f-aac2-e4d8d097bac8 {{(pid=68906) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 629.643656] env[68906]: DEBUG oslo_concurrency.lockutils [None req-b7790bd5-6745-473c-89fa-de780d438374 tempest-ServerExternalEventsTest-1430980438 tempest-ServerExternalEventsTest-1430980438-project-member] Acquiring lock "refresh_cache-e2ee8d01-b1d3-4bde-81ae-668ffeef42b0" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 629.643656] env[68906]: DEBUG oslo_concurrency.lockutils [None req-b7790bd5-6745-473c-89fa-de780d438374 tempest-ServerExternalEventsTest-1430980438 tempest-ServerExternalEventsTest-1430980438-project-member] Acquired lock "refresh_cache-e2ee8d01-b1d3-4bde-81ae-668ffeef42b0" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 629.643656] env[68906]: DEBUG nova.network.neutron [None req-b7790bd5-6745-473c-89fa-de780d438374 tempest-ServerExternalEventsTest-1430980438 tempest-ServerExternalEventsTest-1430980438-project-member] [instance: e2ee8d01-b1d3-4bde-81ae-668ffeef42b0] Building network info cache for instance {{(pid=68906) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 629.653105] env[68906]: DEBUG oslo_vmware.api [-] Task: {'id': task-3475254, 'name': CreateVM_Task, 'duration_secs': 0.402567} completed successfully. 
{{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 629.654325] env[68906]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 57feb127-36f1-403c-bbca-7054286c1972] Created VM on the ESX host {{(pid=68906) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 629.721194] env[68906]: DEBUG oslo_vmware.service [None req-1fdaad8b-4336-4044-9b9d-0b74397210fc tempest-DeleteServersAdminTestJSON-1668331877 tempest-DeleteServersAdminTestJSON-1668331877-project-member] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b6f7577b-4879-4f1e-be31-5dba03ff88b4 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 629.737101] env[68906]: DEBUG oslo_concurrency.lockutils [None req-1fdaad8b-4336-4044-9b9d-0b74397210fc tempest-DeleteServersAdminTestJSON-1668331877 tempest-DeleteServersAdminTestJSON-1668331877-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 629.737282] env[68906]: DEBUG oslo_concurrency.lockutils [None req-1fdaad8b-4336-4044-9b9d-0b74397210fc tempest-DeleteServersAdminTestJSON-1668331877 tempest-DeleteServersAdminTestJSON-1668331877-project-member] Acquired lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 629.738107] env[68906]: DEBUG oslo_concurrency.lockutils [None req-1fdaad8b-4336-4044-9b9d-0b74397210fc tempest-DeleteServersAdminTestJSON-1668331877 tempest-DeleteServersAdminTestJSON-1668331877-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 629.738303] env[68906]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-ec05366d-fd20-42e6-9c45-e2145cf3eb00 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 629.748572] env[68906]: DEBUG oslo_vmware.api [None req-1fdaad8b-4336-4044-9b9d-0b74397210fc tempest-DeleteServersAdminTestJSON-1668331877 tempest-DeleteServersAdminTestJSON-1668331877-project-member] Waiting for the task: (returnval){ [ 629.748572] env[68906]: value = "session[52a3cb0d-1212-490c-2669-91043b4da4d8]522f167c-92e4-e6e5-5754-a5dbb6724c37" [ 629.748572] env[68906]: _type = "Task" [ 629.748572] env[68906]: } to complete. {{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 629.764416] env[68906]: DEBUG oslo_vmware.api [None req-1fdaad8b-4336-4044-9b9d-0b74397210fc tempest-DeleteServersAdminTestJSON-1668331877 tempest-DeleteServersAdminTestJSON-1668331877-project-member] Task: {'id': session[52a3cb0d-1212-490c-2669-91043b4da4d8]522f167c-92e4-e6e5-5754-a5dbb6724c37, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 629.808860] env[68906]: DEBUG nova.network.neutron [None req-b7790bd5-6745-473c-89fa-de780d438374 tempest-ServerExternalEventsTest-1430980438 tempest-ServerExternalEventsTest-1430980438-project-member] [instance: e2ee8d01-b1d3-4bde-81ae-668ffeef42b0] Instance cache missing network info. 
{{(pid=68906) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 630.260962] env[68906]: DEBUG oslo_concurrency.lockutils [None req-1fdaad8b-4336-4044-9b9d-0b74397210fc tempest-DeleteServersAdminTestJSON-1668331877 tempest-DeleteServersAdminTestJSON-1668331877-project-member] Releasing lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 630.261854] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-1fdaad8b-4336-4044-9b9d-0b74397210fc tempest-DeleteServersAdminTestJSON-1668331877 tempest-DeleteServersAdminTestJSON-1668331877-project-member] [instance: 57feb127-36f1-403c-bbca-7054286c1972] Processing image b1400c31-d33b-4e13-944f-4c645e62493e {{(pid=68906) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 630.262159] env[68906]: DEBUG oslo_concurrency.lockutils [None req-1fdaad8b-4336-4044-9b9d-0b74397210fc tempest-DeleteServersAdminTestJSON-1668331877 tempest-DeleteServersAdminTestJSON-1668331877-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e/b1400c31-d33b-4e13-944f-4c645e62493e.vmdk" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 630.262274] env[68906]: DEBUG oslo_concurrency.lockutils [None req-1fdaad8b-4336-4044-9b9d-0b74397210fc tempest-DeleteServersAdminTestJSON-1668331877 tempest-DeleteServersAdminTestJSON-1668331877-project-member] Acquired lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e/b1400c31-d33b-4e13-944f-4c645e62493e.vmdk" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 630.262687] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-1fdaad8b-4336-4044-9b9d-0b74397210fc tempest-DeleteServersAdminTestJSON-1668331877 tempest-DeleteServersAdminTestJSON-1668331877-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=68906) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 630.262938] env[68906]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-76731590-cd44-411b-aecb-5e67e7e8fe15 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 630.270996] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-1fdaad8b-4336-4044-9b9d-0b74397210fc tempest-DeleteServersAdminTestJSON-1668331877 tempest-DeleteServersAdminTestJSON-1668331877-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=68906) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 630.271198] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-1fdaad8b-4336-4044-9b9d-0b74397210fc tempest-DeleteServersAdminTestJSON-1668331877 tempest-DeleteServersAdminTestJSON-1668331877-project-member] Folder [datastore2] devstack-image-cache_base created. 
{{(pid=68906) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 630.272110] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-45c84668-5f35-47ec-b7a5-643db4662706 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 630.278666] env[68906]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-c14b0dfc-8116-41e0-a7c5-d40cc42df850 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 630.283450] env[68906]: DEBUG oslo_vmware.api [None req-1fdaad8b-4336-4044-9b9d-0b74397210fc tempest-DeleteServersAdminTestJSON-1668331877 tempest-DeleteServersAdminTestJSON-1668331877-project-member] Waiting for the task: (returnval){ [ 630.283450] env[68906]: value = "session[52a3cb0d-1212-490c-2669-91043b4da4d8]52e0b8b2-e236-3cfd-7bad-fb9c0d529692" [ 630.283450] env[68906]: _type = "Task" [ 630.283450] env[68906]: } to complete. {{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 630.295535] env[68906]: DEBUG oslo_vmware.api [None req-1fdaad8b-4336-4044-9b9d-0b74397210fc tempest-DeleteServersAdminTestJSON-1668331877 tempest-DeleteServersAdminTestJSON-1668331877-project-member] Task: {'id': session[52a3cb0d-1212-490c-2669-91043b4da4d8]52e0b8b2-e236-3cfd-7bad-fb9c0d529692, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 630.757838] env[68906]: DEBUG nova.network.neutron [None req-b7790bd5-6745-473c-89fa-de780d438374 tempest-ServerExternalEventsTest-1430980438 tempest-ServerExternalEventsTest-1430980438-project-member] [instance: e2ee8d01-b1d3-4bde-81ae-668ffeef42b0] Updating instance_info_cache with network_info: [{"id": "9bdacd11-d4fc-4e1f-aac2-e4d8d097bac8", "address": "fa:16:3e:9a:6f:46", "network": {"id": "63efabfb-0028-4758-9626-5f9860440121", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.206", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "1ae7bf3a375d41c6af5e7536af51ffd1", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "69054a13-b7ef-44e1-bd3b-3ca5ba602848", "external-id": "nsx-vlan-transportzone-153", "segmentation_id": 153, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap9bdacd11-d4", "ovs_interfaceid": "9bdacd11-d4fc-4e1f-aac2-e4d8d097bac8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68906) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 630.772469] env[68906]: DEBUG oslo_concurrency.lockutils [None req-b7790bd5-6745-473c-89fa-de780d438374 tempest-ServerExternalEventsTest-1430980438 tempest-ServerExternalEventsTest-1430980438-project-member] Releasing lock "refresh_cache-e2ee8d01-b1d3-4bde-81ae-668ffeef42b0" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 
630.773064] env[68906]: DEBUG nova.compute.manager [None req-b7790bd5-6745-473c-89fa-de780d438374 tempest-ServerExternalEventsTest-1430980438 tempest-ServerExternalEventsTest-1430980438-project-member] [instance: e2ee8d01-b1d3-4bde-81ae-668ffeef42b0] Instance network_info: |[{"id": "9bdacd11-d4fc-4e1f-aac2-e4d8d097bac8", "address": "fa:16:3e:9a:6f:46", "network": {"id": "63efabfb-0028-4758-9626-5f9860440121", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.206", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "1ae7bf3a375d41c6af5e7536af51ffd1", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "69054a13-b7ef-44e1-bd3b-3ca5ba602848", "external-id": "nsx-vlan-transportzone-153", "segmentation_id": 153, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap9bdacd11-d4", "ovs_interfaceid": "9bdacd11-d4fc-4e1f-aac2-e4d8d097bac8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=68906) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 630.773680] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-b7790bd5-6745-473c-89fa-de780d438374 tempest-ServerExternalEventsTest-1430980438 tempest-ServerExternalEventsTest-1430980438-project-member] [instance: e2ee8d01-b1d3-4bde-81ae-668ffeef42b0] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:9a:6f:46', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '69054a13-b7ef-44e1-bd3b-3ca5ba602848', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '9bdacd11-d4fc-4e1f-aac2-e4d8d097bac8', 'vif_model': 'vmxnet3'}] {{(pid=68906) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 630.786308] env[68906]: DEBUG nova.virt.vmwareapi.vm_util [None req-b7790bd5-6745-473c-89fa-de780d438374 tempest-ServerExternalEventsTest-1430980438 tempest-ServerExternalEventsTest-1430980438-project-member] Creating folder: Project (c3383efc94034ed9a8b2f7c73934e3c3). Parent ref: group-v694750. {{(pid=68906) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 630.787103] env[68906]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-f020a675-61ef-48d9-b26b-daba9b4b18fe {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 630.805732] env[68906]: INFO nova.virt.vmwareapi.vm_util [None req-b7790bd5-6745-473c-89fa-de780d438374 tempest-ServerExternalEventsTest-1430980438 tempest-ServerExternalEventsTest-1430980438-project-member] Created folder: Project (c3383efc94034ed9a8b2f7c73934e3c3) in parent group-v694750. [ 630.806139] env[68906]: DEBUG nova.virt.vmwareapi.vm_util [None req-b7790bd5-6745-473c-89fa-de780d438374 tempest-ServerExternalEventsTest-1430980438 tempest-ServerExternalEventsTest-1430980438-project-member] Creating folder: Instances. Parent ref: group-v694754. 
{{(pid=68906) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 630.810297] env[68906]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-6655f5c5-1a77-43f7-86a3-953fa4d17a49 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 630.812464] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-1fdaad8b-4336-4044-9b9d-0b74397210fc tempest-DeleteServersAdminTestJSON-1668331877 tempest-DeleteServersAdminTestJSON-1668331877-project-member] [instance: 57feb127-36f1-403c-bbca-7054286c1972] Preparing fetch location {{(pid=68906) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 630.812872] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-1fdaad8b-4336-4044-9b9d-0b74397210fc tempest-DeleteServersAdminTestJSON-1668331877 tempest-DeleteServersAdminTestJSON-1668331877-project-member] Creating directory with path [datastore2] vmware_temp/984d8c72-4a3f-4db8-8df6-23da021ebc2f/b1400c31-d33b-4e13-944f-4c645e62493e {{(pid=68906) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 630.813371] env[68906]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-74807cb1-7ac4-405a-8c39-947fa64cf1c6 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 630.821362] env[68906]: INFO nova.virt.vmwareapi.vm_util [None req-b7790bd5-6745-473c-89fa-de780d438374 tempest-ServerExternalEventsTest-1430980438 tempest-ServerExternalEventsTest-1430980438-project-member] Created folder: Instances in parent group-v694754. [ 630.821362] env[68906]: DEBUG oslo.service.loopingcall [None req-b7790bd5-6745-473c-89fa-de780d438374 tempest-ServerExternalEventsTest-1430980438 tempest-ServerExternalEventsTest-1430980438-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=68906) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 630.821362] env[68906]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: e2ee8d01-b1d3-4bde-81ae-668ffeef42b0] Creating VM on the ESX host {{(pid=68906) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 630.821362] env[68906]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-8d9f1e34-5c01-4d2e-82e0-1b265c07f717 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 630.841057] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-1fdaad8b-4336-4044-9b9d-0b74397210fc tempest-DeleteServersAdminTestJSON-1668331877 tempest-DeleteServersAdminTestJSON-1668331877-project-member] Created directory with path [datastore2] vmware_temp/984d8c72-4a3f-4db8-8df6-23da021ebc2f/b1400c31-d33b-4e13-944f-4c645e62493e {{(pid=68906) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 630.841226] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-1fdaad8b-4336-4044-9b9d-0b74397210fc tempest-DeleteServersAdminTestJSON-1668331877 tempest-DeleteServersAdminTestJSON-1668331877-project-member] [instance: 57feb127-36f1-403c-bbca-7054286c1972] Fetch image to [datastore2] vmware_temp/984d8c72-4a3f-4db8-8df6-23da021ebc2f/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk {{(pid=68906) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 630.841394] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-1fdaad8b-4336-4044-9b9d-0b74397210fc tempest-DeleteServersAdminTestJSON-1668331877 tempest-DeleteServersAdminTestJSON-1668331877-project-member] [instance: 57feb127-36f1-403c-bbca-7054286c1972] Downloading image file data b1400c31-d33b-4e13-944f-4c645e62493e to [datastore2] vmware_temp/984d8c72-4a3f-4db8-8df6-23da021ebc2f/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk on the data store datastore2 {{(pid=68906) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 630.843845] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8858f13b-1376-42d9-93fd-aa4a330c7c77 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 630.847031] env[68906]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 630.847031] env[68906]: value = "task-3475257" [ 630.847031] env[68906]: _type = "Task" [ 630.847031] env[68906]: } to complete. {{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 630.858593] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1bba62c7-bf0f-47fa-8b21-d24741464da5 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 630.862068] env[68906]: DEBUG oslo_vmware.api [-] Task: {'id': task-3475257, 'name': CreateVM_Task} progress is 0%. 
{{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 630.873104] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e62a17be-4a9a-48c5-a161-050233cfdf51 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 630.922506] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b9e014d1-b56e-4660-be75-93a06d9d97f0 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 630.929551] env[68906]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-ef647957-ac66-4c8f-bf22-362e945288fc {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 630.968541] env[68906]: DEBUG nova.virt.vmwareapi.images [None req-1fdaad8b-4336-4044-9b9d-0b74397210fc tempest-DeleteServersAdminTestJSON-1668331877 tempest-DeleteServersAdminTestJSON-1668331877-project-member] [instance: 57feb127-36f1-403c-bbca-7054286c1972] Downloading image file data b1400c31-d33b-4e13-944f-4c645e62493e to the data store datastore2 {{(pid=68906) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 631.059709] env[68906]: DEBUG oslo_vmware.rw_handles [None req-1fdaad8b-4336-4044-9b9d-0b74397210fc tempest-DeleteServersAdminTestJSON-1668331877 tempest-DeleteServersAdminTestJSON-1668331877-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/984d8c72-4a3f-4db8-8df6-23da021ebc2f/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=68906) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 631.131595] env[68906]: DEBUG oslo_vmware.rw_handles [None req-1fdaad8b-4336-4044-9b9d-0b74397210fc tempest-DeleteServersAdminTestJSON-1668331877 tempest-DeleteServersAdminTestJSON-1668331877-project-member] Completed reading data from the image iterator. {{(pid=68906) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 631.131595] env[68906]: DEBUG oslo_vmware.rw_handles [None req-1fdaad8b-4336-4044-9b9d-0b74397210fc tempest-DeleteServersAdminTestJSON-1668331877 tempest-DeleteServersAdminTestJSON-1668331877-project-member] Closing write handle for https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/984d8c72-4a3f-4db8-8df6-23da021ebc2f/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=68906) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 631.359424] env[68906]: DEBUG oslo_vmware.api [-] Task: {'id': task-3475257, 'name': CreateVM_Task, 'duration_secs': 0.365402} completed successfully. 
{{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 631.359774] env[68906]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: e2ee8d01-b1d3-4bde-81ae-668ffeef42b0] Created VM on the ESX host {{(pid=68906) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 631.360293] env[68906]: DEBUG oslo_concurrency.lockutils [None req-b7790bd5-6745-473c-89fa-de780d438374 tempest-ServerExternalEventsTest-1430980438 tempest-ServerExternalEventsTest-1430980438-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 631.360475] env[68906]: DEBUG oslo_concurrency.lockutils [None req-b7790bd5-6745-473c-89fa-de780d438374 tempest-ServerExternalEventsTest-1430980438 tempest-ServerExternalEventsTest-1430980438-project-member] Acquired lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 631.361246] env[68906]: DEBUG oslo_concurrency.lockutils [None req-b7790bd5-6745-473c-89fa-de780d438374 tempest-ServerExternalEventsTest-1430980438 tempest-ServerExternalEventsTest-1430980438-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 631.361246] env[68906]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-9a942dca-1f3d-4bd1-a8cd-8891055856f5 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 631.366114] env[68906]: DEBUG oslo_vmware.api [None req-b7790bd5-6745-473c-89fa-de780d438374 tempest-ServerExternalEventsTest-1430980438 tempest-ServerExternalEventsTest-1430980438-project-member] Waiting for the task: (returnval){ [ 631.366114] env[68906]: value = "session[52a3cb0d-1212-490c-2669-91043b4da4d8]528e0ac7-c63c-47ae-f0eb-ab7346f0219a" [ 631.366114] env[68906]: _type = "Task" [ 631.366114] env[68906]: } to complete. {{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 631.374589] env[68906]: DEBUG oslo_vmware.api [None req-b7790bd5-6745-473c-89fa-de780d438374 tempest-ServerExternalEventsTest-1430980438 tempest-ServerExternalEventsTest-1430980438-project-member] Task: {'id': session[52a3cb0d-1212-490c-2669-91043b4da4d8]528e0ac7-c63c-47ae-f0eb-ab7346f0219a, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 631.515539] env[68906]: DEBUG oslo_concurrency.lockutils [None req-6e630c8f-2ec9-418e-9d9b-a4ca682ae7cd tempest-FloatingIPsAssociationTestJSON-1370684228 tempest-FloatingIPsAssociationTestJSON-1370684228-project-member] Acquiring lock "46481a4e-ac53-456d-b6cb-9f3ffbccf407" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 631.515771] env[68906]: DEBUG oslo_concurrency.lockutils [None req-6e630c8f-2ec9-418e-9d9b-a4ca682ae7cd tempest-FloatingIPsAssociationTestJSON-1370684228 tempest-FloatingIPsAssociationTestJSON-1370684228-project-member] Lock "46481a4e-ac53-456d-b6cb-9f3ffbccf407" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 631.529149] env[68906]: DEBUG nova.compute.manager [None req-6e630c8f-2ec9-418e-9d9b-a4ca682ae7cd tempest-FloatingIPsAssociationTestJSON-1370684228 tempest-FloatingIPsAssociationTestJSON-1370684228-project-member] [instance: 46481a4e-ac53-456d-b6cb-9f3ffbccf407] Starting instance... {{(pid=68906) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 631.556516] env[68906]: DEBUG oslo_concurrency.lockutils [None req-77526f0d-ba0d-4648-acf7-33789433c0ca tempest-ImagesNegativeTestJSON-533604680 tempest-ImagesNegativeTestJSON-533604680-project-member] Acquiring lock "d2258ded-478a-4530-b940-386286702048" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 631.556779] env[68906]: DEBUG oslo_concurrency.lockutils [None req-77526f0d-ba0d-4648-acf7-33789433c0ca tempest-ImagesNegativeTestJSON-533604680 tempest-ImagesNegativeTestJSON-533604680-project-member] Lock "d2258ded-478a-4530-b940-386286702048" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 631.568218] env[68906]: DEBUG nova.compute.manager [None req-77526f0d-ba0d-4648-acf7-33789433c0ca tempest-ImagesNegativeTestJSON-533604680 tempest-ImagesNegativeTestJSON-533604680-project-member] [instance: d2258ded-478a-4530-b940-386286702048] Starting instance... 
{{(pid=68906) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 631.608138] env[68906]: DEBUG oslo_concurrency.lockutils [None req-6e630c8f-2ec9-418e-9d9b-a4ca682ae7cd tempest-FloatingIPsAssociationTestJSON-1370684228 tempest-FloatingIPsAssociationTestJSON-1370684228-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 631.608407] env[68906]: DEBUG oslo_concurrency.lockutils [None req-6e630c8f-2ec9-418e-9d9b-a4ca682ae7cd tempest-FloatingIPsAssociationTestJSON-1370684228 tempest-FloatingIPsAssociationTestJSON-1370684228-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 631.610182] env[68906]: INFO nova.compute.claims [None req-6e630c8f-2ec9-418e-9d9b-a4ca682ae7cd tempest-FloatingIPsAssociationTestJSON-1370684228 tempest-FloatingIPsAssociationTestJSON-1370684228-project-member] [instance: 46481a4e-ac53-456d-b6cb-9f3ffbccf407] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 631.630801] env[68906]: DEBUG oslo_concurrency.lockutils [None req-77526f0d-ba0d-4648-acf7-33789433c0ca tempest-ImagesNegativeTestJSON-533604680 tempest-ImagesNegativeTestJSON-533604680-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 631.755333] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0431a3b5-cf57-4aa0-a67c-3c61be08fe09 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 631.764876] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-59259c62-306f-431e-b13d-36cc525b1bcf {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 631.795363] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2765fcdf-bc61-437c-83f8-6e1b4380d8f8 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 631.803335] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dcc358a3-c001-456f-93d0-1517b2150c8c {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 631.818389] env[68906]: DEBUG nova.compute.provider_tree [None req-6e630c8f-2ec9-418e-9d9b-a4ca682ae7cd tempest-FloatingIPsAssociationTestJSON-1370684228 tempest-FloatingIPsAssociationTestJSON-1370684228-project-member] Inventory has not changed in ProviderTree for provider: 1119f6db-bfd7-4ef3-bdff-5c6974dc249b {{(pid=68906) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 631.836989] env[68906]: DEBUG nova.scheduler.client.report [None req-6e630c8f-2ec9-418e-9d9b-a4ca682ae7cd tempest-FloatingIPsAssociationTestJSON-1370684228 tempest-FloatingIPsAssociationTestJSON-1370684228-project-member] Inventory has not changed for provider 1119f6db-bfd7-4ef3-bdff-5c6974dc249b based on inventory data: {'VCPU': {'total': 48, 'reserved': 
0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 93, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68906) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 631.865762] env[68906]: DEBUG oslo_concurrency.lockutils [None req-6e630c8f-2ec9-418e-9d9b-a4ca682ae7cd tempest-FloatingIPsAssociationTestJSON-1370684228 tempest-FloatingIPsAssociationTestJSON-1370684228-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.257s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 631.866864] env[68906]: DEBUG nova.compute.manager [None req-6e630c8f-2ec9-418e-9d9b-a4ca682ae7cd tempest-FloatingIPsAssociationTestJSON-1370684228 tempest-FloatingIPsAssociationTestJSON-1370684228-project-member] [instance: 46481a4e-ac53-456d-b6cb-9f3ffbccf407] Start building networks asynchronously for instance. {{(pid=68906) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 631.870370] env[68906]: DEBUG oslo_concurrency.lockutils [None req-77526f0d-ba0d-4648-acf7-33789433c0ca tempest-ImagesNegativeTestJSON-533604680 tempest-ImagesNegativeTestJSON-533604680-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.241s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 631.876684] env[68906]: INFO nova.compute.claims [None req-77526f0d-ba0d-4648-acf7-33789433c0ca tempest-ImagesNegativeTestJSON-533604680 tempest-ImagesNegativeTestJSON-533604680-project-member] [instance: d2258ded-478a-4530-b940-386286702048] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 631.896610] env[68906]: DEBUG oslo_concurrency.lockutils [None req-b7790bd5-6745-473c-89fa-de780d438374 tempest-ServerExternalEventsTest-1430980438 tempest-ServerExternalEventsTest-1430980438-project-member] Releasing lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 631.896856] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-b7790bd5-6745-473c-89fa-de780d438374 tempest-ServerExternalEventsTest-1430980438 tempest-ServerExternalEventsTest-1430980438-project-member] [instance: e2ee8d01-b1d3-4bde-81ae-668ffeef42b0] Processing image b1400c31-d33b-4e13-944f-4c645e62493e {{(pid=68906) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 631.897063] env[68906]: DEBUG oslo_concurrency.lockutils [None req-b7790bd5-6745-473c-89fa-de780d438374 tempest-ServerExternalEventsTest-1430980438 tempest-ServerExternalEventsTest-1430980438-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e/b1400c31-d33b-4e13-944f-4c645e62493e.vmdk" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 631.951295] env[68906]: DEBUG nova.compute.utils [None req-6e630c8f-2ec9-418e-9d9b-a4ca682ae7cd tempest-FloatingIPsAssociationTestJSON-1370684228 tempest-FloatingIPsAssociationTestJSON-1370684228-project-member] Using /dev/sd instead of None {{(pid=68906) get_next_device_name 
/opt/stack/nova/nova/compute/utils.py:238}} [ 631.953711] env[68906]: DEBUG nova.compute.manager [None req-6e630c8f-2ec9-418e-9d9b-a4ca682ae7cd tempest-FloatingIPsAssociationTestJSON-1370684228 tempest-FloatingIPsAssociationTestJSON-1370684228-project-member] [instance: 46481a4e-ac53-456d-b6cb-9f3ffbccf407] Allocating IP information in the background. {{(pid=68906) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 631.954071] env[68906]: DEBUG nova.network.neutron [None req-6e630c8f-2ec9-418e-9d9b-a4ca682ae7cd tempest-FloatingIPsAssociationTestJSON-1370684228 tempest-FloatingIPsAssociationTestJSON-1370684228-project-member] [instance: 46481a4e-ac53-456d-b6cb-9f3ffbccf407] allocate_for_instance() {{(pid=68906) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 631.975068] env[68906]: DEBUG nova.compute.manager [None req-6e630c8f-2ec9-418e-9d9b-a4ca682ae7cd tempest-FloatingIPsAssociationTestJSON-1370684228 tempest-FloatingIPsAssociationTestJSON-1370684228-project-member] [instance: 46481a4e-ac53-456d-b6cb-9f3ffbccf407] Start building block device mappings for instance. {{(pid=68906) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 632.059167] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d17f6d6d-aa06-4fad-8e81-87d6a5b80552 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 632.069187] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b68d70e5-7ca3-4521-99ce-8f6d09e53a2d {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 632.101742] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8756f832-dbf3-4470-9fdd-7445f7d17b18 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 632.105139] env[68906]: DEBUG nova.compute.manager [None req-6e630c8f-2ec9-418e-9d9b-a4ca682ae7cd tempest-FloatingIPsAssociationTestJSON-1370684228 tempest-FloatingIPsAssociationTestJSON-1370684228-project-member] [instance: 46481a4e-ac53-456d-b6cb-9f3ffbccf407] Start spawning the instance on the hypervisor. 
{{(pid=68906) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 632.113400] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6567739a-e782-4ec0-a156-1d578fd03f0c {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 632.133792] env[68906]: DEBUG nova.compute.provider_tree [None req-77526f0d-ba0d-4648-acf7-33789433c0ca tempest-ImagesNegativeTestJSON-533604680 tempest-ImagesNegativeTestJSON-533604680-project-member] Inventory has not changed in ProviderTree for provider: 1119f6db-bfd7-4ef3-bdff-5c6974dc249b {{(pid=68906) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 632.141765] env[68906]: DEBUG nova.compute.manager [req-c76a8dcc-788f-4984-8136-4456af4294c9 req-3824feaf-4d11-4810-b78c-a57a7cb02c01 service nova] [instance: 57feb127-36f1-403c-bbca-7054286c1972] Received event network-vif-plugged-6c63bf97-aa05-4edb-95ea-4046e0527259 {{(pid=68906) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 632.141982] env[68906]: DEBUG oslo_concurrency.lockutils [req-c76a8dcc-788f-4984-8136-4456af4294c9 req-3824feaf-4d11-4810-b78c-a57a7cb02c01 service nova] Acquiring lock "57feb127-36f1-403c-bbca-7054286c1972-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 632.142224] env[68906]: DEBUG oslo_concurrency.lockutils [req-c76a8dcc-788f-4984-8136-4456af4294c9 req-3824feaf-4d11-4810-b78c-a57a7cb02c01 service nova] Lock "57feb127-36f1-403c-bbca-7054286c1972-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 632.142372] env[68906]: DEBUG oslo_concurrency.lockutils [req-c76a8dcc-788f-4984-8136-4456af4294c9 req-3824feaf-4d11-4810-b78c-a57a7cb02c01 service nova] Lock "57feb127-36f1-403c-bbca-7054286c1972-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 632.142530] env[68906]: DEBUG nova.compute.manager [req-c76a8dcc-788f-4984-8136-4456af4294c9 req-3824feaf-4d11-4810-b78c-a57a7cb02c01 service nova] [instance: 57feb127-36f1-403c-bbca-7054286c1972] No waiting events found dispatching network-vif-plugged-6c63bf97-aa05-4edb-95ea-4046e0527259 {{(pid=68906) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 632.142684] env[68906]: WARNING nova.compute.manager [req-c76a8dcc-788f-4984-8136-4456af4294c9 req-3824feaf-4d11-4810-b78c-a57a7cb02c01 service nova] [instance: 57feb127-36f1-403c-bbca-7054286c1972] Received unexpected event network-vif-plugged-6c63bf97-aa05-4edb-95ea-4046e0527259 for instance with vm_state building and task_state spawning. 
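The network-vif-plugged sequence just above shows Neutron's external event arriving while the instance is still building: the compute manager takes the per-instance "<uuid>-events" lock, finds no registered waiter for the event, and logs it as unexpected (a benign race during spawn). A minimal sketch of that dispatch pattern follows, using plain threading; pop_instance_event and external_instance_event mirror the names in the log records above, but the class layout and prepare_for_event are illustrative, not Nova's actual implementation.

    import threading

    class InstanceEvents:
        """Per-instance event registry guarded by one lock, standing in
        for the '<uuid>-events' lock seen in the log."""

        def __init__(self):
            self._events = {}   # {instance_uuid: {event_name: threading.Event}}
            self._lock = threading.Lock()

        def prepare_for_event(self, instance_uuid, event_name):
            # A spawner registers interest *before* plugging the VIF, then
            # blocks on the returned Event until Neutron reports success.
            with self._lock:
                ev = threading.Event()
                self._events.setdefault(instance_uuid, {})[event_name] = ev
                return ev

        def pop_instance_event(self, instance_uuid, event_name):
            # Remove and return the waiter, or None if nobody registered.
            with self._lock:
                return self._events.get(instance_uuid, {}).pop(event_name, None)

    def external_instance_event(events, instance_uuid, event_name):
        waiter = events.pop_instance_event(instance_uuid, event_name)
        if waiter is None:
            # Corresponds to the WARNING above: the event beat the waiter.
            print('Received unexpected event %s for instance %s'
                  % (event_name, instance_uuid))
        else:
            waiter.set()

    # Replaying the situation from the log: no waiter was registered yet,
    # so the event is reported as unexpected.
    events = InstanceEvents()
    external_instance_event(
        events,
        '57feb127-36f1-403c-bbca-7054286c1972',
        'network-vif-plugged-6c63bf97-aa05-4edb-95ea-4046e0527259')

Because the instance was in vm_state building / task_state spawning and had not yet registered a waiter, the event is dropped with a warning rather than signalling anyone; once the spawner does register and plug the VIF, a second event (or the cached port state) unblocks it.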
[ 632.147449] env[68906]: DEBUG nova.virt.hardware [None req-6e630c8f-2ec9-418e-9d9b-a4ca682ae7cd tempest-FloatingIPsAssociationTestJSON-1370684228 tempest-FloatingIPsAssociationTestJSON-1370684228-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-17T13:00:39Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-17T13:00:23Z,direct_url=,disk_format='vmdk',id=b1400c31-d33b-4e13-944f-4c645e62493e,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='1ae7bf3a375d41c6af5e7536af51ffd1',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-17T13:00:24Z,virtual_size=,visibility=), allow threads: False {{(pid=68906) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 632.147660] env[68906]: DEBUG nova.virt.hardware [None req-6e630c8f-2ec9-418e-9d9b-a4ca682ae7cd tempest-FloatingIPsAssociationTestJSON-1370684228 tempest-FloatingIPsAssociationTestJSON-1370684228-project-member] Flavor limits 0:0:0 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 632.151463] env[68906]: DEBUG nova.virt.hardware [None req-6e630c8f-2ec9-418e-9d9b-a4ca682ae7cd tempest-FloatingIPsAssociationTestJSON-1370684228 tempest-FloatingIPsAssociationTestJSON-1370684228-project-member] Image limits 0:0:0 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 632.151463] env[68906]: DEBUG nova.virt.hardware [None req-6e630c8f-2ec9-418e-9d9b-a4ca682ae7cd tempest-FloatingIPsAssociationTestJSON-1370684228 tempest-FloatingIPsAssociationTestJSON-1370684228-project-member] Flavor pref 0:0:0 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 632.151463] env[68906]: DEBUG nova.virt.hardware [None req-6e630c8f-2ec9-418e-9d9b-a4ca682ae7cd tempest-FloatingIPsAssociationTestJSON-1370684228 tempest-FloatingIPsAssociationTestJSON-1370684228-project-member] Image pref 0:0:0 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 632.151463] env[68906]: DEBUG nova.virt.hardware [None req-6e630c8f-2ec9-418e-9d9b-a4ca682ae7cd tempest-FloatingIPsAssociationTestJSON-1370684228 tempest-FloatingIPsAssociationTestJSON-1370684228-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 632.151463] env[68906]: DEBUG nova.virt.hardware [None req-6e630c8f-2ec9-418e-9d9b-a4ca682ae7cd tempest-FloatingIPsAssociationTestJSON-1370684228 tempest-FloatingIPsAssociationTestJSON-1370684228-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68906) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 632.151715] env[68906]: DEBUG nova.virt.hardware [None req-6e630c8f-2ec9-418e-9d9b-a4ca682ae7cd tempest-FloatingIPsAssociationTestJSON-1370684228 tempest-FloatingIPsAssociationTestJSON-1370684228-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68906) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 632.151715] 
env[68906]: DEBUG nova.virt.hardware [None req-6e630c8f-2ec9-418e-9d9b-a4ca682ae7cd tempest-FloatingIPsAssociationTestJSON-1370684228 tempest-FloatingIPsAssociationTestJSON-1370684228-project-member] Got 1 possible topologies {{(pid=68906) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 632.151715] env[68906]: DEBUG nova.virt.hardware [None req-6e630c8f-2ec9-418e-9d9b-a4ca682ae7cd tempest-FloatingIPsAssociationTestJSON-1370684228 tempest-FloatingIPsAssociationTestJSON-1370684228-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68906) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 632.151715] env[68906]: DEBUG nova.virt.hardware [None req-6e630c8f-2ec9-418e-9d9b-a4ca682ae7cd tempest-FloatingIPsAssociationTestJSON-1370684228 tempest-FloatingIPsAssociationTestJSON-1370684228-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68906) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 632.151715] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b18d0694-7b6b-4c62-ba5a-9134eaed0555 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 632.154083] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 632.154635] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 632.154820] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Starting heal instance info cache {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 632.154939] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Rebuilding the list of instances to heal {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 632.160072] env[68906]: DEBUG nova.scheduler.client.report [None req-77526f0d-ba0d-4648-acf7-33789433c0ca tempest-ImagesNegativeTestJSON-533604680 tempest-ImagesNegativeTestJSON-533604680-project-member] Inventory has not changed for provider 1119f6db-bfd7-4ef3-bdff-5c6974dc249b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 93, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68906) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 632.167053] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-27ab5e22-a0b4-41c9-a144-14f813c68503 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 632.184327] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: 
57feb127-36f1-403c-bbca-7054286c1972] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 632.184473] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: e2ee8d01-b1d3-4bde-81ae-668ffeef42b0] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 632.184600] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: 46481a4e-ac53-456d-b6cb-9f3ffbccf407] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 632.184868] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: d2258ded-478a-4530-b940-386286702048] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 632.184868] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Didn't find any instances for network info cache update. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 632.185285] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 632.185523] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 632.185703] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 632.185880] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 632.186083] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 632.186263] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 632.186508] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=68906) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 632.186665] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager.update_available_resource {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 632.188279] env[68906]: DEBUG oslo_concurrency.lockutils [None req-77526f0d-ba0d-4648-acf7-33789433c0ca tempest-ImagesNegativeTestJSON-533604680 tempest-ImagesNegativeTestJSON-533604680-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.318s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 632.188719] env[68906]: DEBUG nova.compute.manager [None req-77526f0d-ba0d-4648-acf7-33789433c0ca tempest-ImagesNegativeTestJSON-533604680 tempest-ImagesNegativeTestJSON-533604680-project-member] [instance: d2258ded-478a-4530-b940-386286702048] Start building networks asynchronously for instance. {{(pid=68906) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 632.205486] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 632.205723] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 632.205887] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 632.206324] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=68906) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 632.207246] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-432acf31-568a-46f9-af2d-c1bbda470c83 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 632.216763] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e0000852-05c0-4820-bcbd-173be03132c6 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 632.232461] env[68906]: DEBUG nova.policy [None req-6e630c8f-2ec9-418e-9d9b-a4ca682ae7cd tempest-FloatingIPsAssociationTestJSON-1370684228 tempest-FloatingIPsAssociationTestJSON-1370684228-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'cc6bd4c2b2764a43991261d773efe282', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 
'43130cf9c64b405a9e5e0094228c72de', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68906) authorize /opt/stack/nova/nova/policy.py:203}} [ 632.234415] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-606140cc-7a34-40ef-8a19-2eb2eca2754a {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 632.243241] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7262cf53-5424-47a4-8f19-8c4f1181b07e {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 632.258197] env[68906]: DEBUG nova.compute.utils [None req-77526f0d-ba0d-4648-acf7-33789433c0ca tempest-ImagesNegativeTestJSON-533604680 tempest-ImagesNegativeTestJSON-533604680-project-member] Using /dev/sd instead of None {{(pid=68906) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 632.284411] env[68906]: DEBUG nova.compute.manager [None req-77526f0d-ba0d-4648-acf7-33789433c0ca tempest-ImagesNegativeTestJSON-533604680 tempest-ImagesNegativeTestJSON-533604680-project-member] [instance: d2258ded-478a-4530-b940-386286702048] Allocating IP information in the background. {{(pid=68906) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 632.286023] env[68906]: DEBUG nova.network.neutron [None req-77526f0d-ba0d-4648-acf7-33789433c0ca tempest-ImagesNegativeTestJSON-533604680 tempest-ImagesNegativeTestJSON-533604680-project-member] [instance: d2258ded-478a-4530-b940-386286702048] allocate_for_instance() {{(pid=68906) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 632.287585] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180960MB free_disk=93GB free_vcpus=48 pci_devices=None {{(pid=68906) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 632.287725] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 632.287911] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 632.289822] env[68906]: DEBUG nova.compute.manager [None req-77526f0d-ba0d-4648-acf7-33789433c0ca tempest-ImagesNegativeTestJSON-533604680 tempest-ImagesNegativeTestJSON-533604680-project-member] [instance: d2258ded-478a-4530-b940-386286702048] Start building block device mappings for instance. 
{{(pid=68906) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 632.376915] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 57feb127-36f1-403c-bbca-7054286c1972 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 632.377206] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance e2ee8d01-b1d3-4bde-81ae-668ffeef42b0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 632.377245] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 46481a4e-ac53-456d-b6cb-9f3ffbccf407 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 632.377620] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance d2258ded-478a-4530-b940-386286702048 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 632.377620] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Total usable vcpus: 48, total allocated vcpus: 4 {{(pid=68906) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 632.377731] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1024MB phys_disk=200GB used_disk=4GB total_vcpus=48 used_vcpus=4 pci_stats=[] {{(pid=68906) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 632.403027] env[68906]: DEBUG nova.compute.manager [None req-77526f0d-ba0d-4648-acf7-33789433c0ca tempest-ImagesNegativeTestJSON-533604680 tempest-ImagesNegativeTestJSON-533604680-project-member] [instance: d2258ded-478a-4530-b940-386286702048] Start spawning the instance on the hypervisor. 
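The "Final resource view" numbers above are straightforward sums over the per-instance placement allocations, plus the reserved memory from the inventory. A quick reconstruction of that arithmetic (assuming the simple rule used_ram = reserved + sum of instance MEMORY_MB, which matches the 1024MB figure):

# Four instances, each with the allocation shown in the log above.
allocations = [{"DISK_GB": 1, "MEMORY_MB": 128, "VCPU": 1}] * 4
reserved_ram_mb = 512  # the MEMORY_MB "reserved" value from the inventory lines

used_ram = reserved_ram_mb + sum(a["MEMORY_MB"] for a in allocations)  # 1024 MB
used_disk = sum(a["DISK_GB"] for a in allocations)                     # 4 GB
used_vcpus = sum(a["VCPU"] for a in allocations)                       # 4

print(f"used_ram={used_ram}MB used_disk={used_disk}GB used_vcpus={used_vcpus}")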
{{(pid=68906) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 632.436036] env[68906]: DEBUG nova.virt.hardware [None req-77526f0d-ba0d-4648-acf7-33789433c0ca tempest-ImagesNegativeTestJSON-533604680 tempest-ImagesNegativeTestJSON-533604680-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-17T13:00:39Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-17T13:00:23Z,direct_url=,disk_format='vmdk',id=b1400c31-d33b-4e13-944f-4c645e62493e,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='1ae7bf3a375d41c6af5e7536af51ffd1',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-17T13:00:24Z,virtual_size=,visibility=), allow threads: False {{(pid=68906) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 632.436281] env[68906]: DEBUG nova.virt.hardware [None req-77526f0d-ba0d-4648-acf7-33789433c0ca tempest-ImagesNegativeTestJSON-533604680 tempest-ImagesNegativeTestJSON-533604680-project-member] Flavor limits 0:0:0 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 632.436438] env[68906]: DEBUG nova.virt.hardware [None req-77526f0d-ba0d-4648-acf7-33789433c0ca tempest-ImagesNegativeTestJSON-533604680 tempest-ImagesNegativeTestJSON-533604680-project-member] Image limits 0:0:0 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 632.436803] env[68906]: DEBUG nova.virt.hardware [None req-77526f0d-ba0d-4648-acf7-33789433c0ca tempest-ImagesNegativeTestJSON-533604680 tempest-ImagesNegativeTestJSON-533604680-project-member] Flavor pref 0:0:0 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 632.437012] env[68906]: DEBUG nova.virt.hardware [None req-77526f0d-ba0d-4648-acf7-33789433c0ca tempest-ImagesNegativeTestJSON-533604680 tempest-ImagesNegativeTestJSON-533604680-project-member] Image pref 0:0:0 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 632.437454] env[68906]: DEBUG nova.virt.hardware [None req-77526f0d-ba0d-4648-acf7-33789433c0ca tempest-ImagesNegativeTestJSON-533604680 tempest-ImagesNegativeTestJSON-533604680-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 632.437454] env[68906]: DEBUG nova.virt.hardware [None req-77526f0d-ba0d-4648-acf7-33789433c0ca tempest-ImagesNegativeTestJSON-533604680 tempest-ImagesNegativeTestJSON-533604680-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68906) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 632.437545] env[68906]: DEBUG nova.virt.hardware [None req-77526f0d-ba0d-4648-acf7-33789433c0ca tempest-ImagesNegativeTestJSON-533604680 tempest-ImagesNegativeTestJSON-533604680-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68906) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 632.437850] env[68906]: DEBUG nova.virt.hardware [None 
req-77526f0d-ba0d-4648-acf7-33789433c0ca tempest-ImagesNegativeTestJSON-533604680 tempest-ImagesNegativeTestJSON-533604680-project-member] Got 1 possible topologies {{(pid=68906) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 632.437945] env[68906]: DEBUG nova.virt.hardware [None req-77526f0d-ba0d-4648-acf7-33789433c0ca tempest-ImagesNegativeTestJSON-533604680 tempest-ImagesNegativeTestJSON-533604680-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68906) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 632.438557] env[68906]: DEBUG nova.virt.hardware [None req-77526f0d-ba0d-4648-acf7-33789433c0ca tempest-ImagesNegativeTestJSON-533604680 tempest-ImagesNegativeTestJSON-533604680-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68906) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 632.439392] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-752b520f-3645-4acd-bdd3-8dba513f864f {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 632.452312] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-08ce86c5-bae7-42dc-8144-4fc02ef0dd1e {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 632.479024] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c5c863dc-34ec-4a64-8cec-06473d9a1606 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 632.485482] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a9e7e873-bf29-4d69-9ed7-01b0a1074506 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 632.516127] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2dbd8d33-40b5-4c48-a2ed-5c4d65fbeb56 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 632.523599] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e6bb1986-1017-4490-ae8e-1fbd9a894d5c {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 632.537275] env[68906]: DEBUG nova.compute.provider_tree [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Inventory has not changed in ProviderTree for provider: 1119f6db-bfd7-4ef3-bdff-5c6974dc249b {{(pid=68906) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 632.546286] env[68906]: DEBUG nova.scheduler.client.report [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Inventory has not changed for provider 1119f6db-bfd7-4ef3-bdff-5c6974dc249b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 93, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68906) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 
632.563216] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=68906) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 632.564478] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.276s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 632.664967] env[68906]: DEBUG nova.policy [None req-77526f0d-ba0d-4648-acf7-33789433c0ca tempest-ImagesNegativeTestJSON-533604680 tempest-ImagesNegativeTestJSON-533604680-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'a4167e4aeb2d4144b8d7185be617a3ac', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '7d38ee4d80bc4d1f87f4cb1f3907fb54', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68906) authorize /opt/stack/nova/nova/policy.py:203}} [ 633.128847] env[68906]: DEBUG oslo_concurrency.lockutils [None req-3d9af540-455f-4eac-be79-8ed7933ffa43 tempest-MigrationsAdminTest-890819033 tempest-MigrationsAdminTest-890819033-project-member] Acquiring lock "da0c4340-a657-43bd-9a98-4c8f50add720" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 633.129283] env[68906]: DEBUG oslo_concurrency.lockutils [None req-3d9af540-455f-4eac-be79-8ed7933ffa43 tempest-MigrationsAdminTest-890819033 tempest-MigrationsAdminTest-890819033-project-member] Lock "da0c4340-a657-43bd-9a98-4c8f50add720" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 633.145622] env[68906]: DEBUG nova.compute.manager [None req-3d9af540-455f-4eac-be79-8ed7933ffa43 tempest-MigrationsAdminTest-890819033 tempest-MigrationsAdminTest-890819033-project-member] [instance: da0c4340-a657-43bd-9a98-4c8f50add720] Starting instance... 
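The nova.virt.hardware lines above walk from flavor/image limits to "Got 1 possible topologies": with one vCPU and effectively unbounded limits, (sockets=1, cores=1, threads=1) is the only factorization. A simplified re-derivation of that enumeration (a sketch of the idea, not nova's actual implementation):

def possible_topologies(vcpus, max_sockets=65536, max_cores=65536, max_threads=65536):
    # Enumerate (sockets, cores, threads) triples whose product equals vcpus
    # and which respect the per-dimension limits.
    found = []
    for sockets in range(1, min(vcpus, max_sockets) + 1):
        if vcpus % sockets:
            continue
        per_socket = vcpus // sockets
        for cores in range(1, min(per_socket, max_cores) + 1):
            if per_socket % cores:
                continue
            threads = per_socket // cores
            if threads <= max_threads:
                found.append((sockets, cores, threads))
    return found

print(possible_topologies(1))  # -> [(1, 1, 1)], i.e. "Got 1 possible topologies"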
{{(pid=68906) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 633.218662] env[68906]: DEBUG oslo_concurrency.lockutils [None req-3d9af540-455f-4eac-be79-8ed7933ffa43 tempest-MigrationsAdminTest-890819033 tempest-MigrationsAdminTest-890819033-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 633.218992] env[68906]: DEBUG oslo_concurrency.lockutils [None req-3d9af540-455f-4eac-be79-8ed7933ffa43 tempest-MigrationsAdminTest-890819033 tempest-MigrationsAdminTest-890819033-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 633.221262] env[68906]: INFO nova.compute.claims [None req-3d9af540-455f-4eac-be79-8ed7933ffa43 tempest-MigrationsAdminTest-890819033 tempest-MigrationsAdminTest-890819033-project-member] [instance: da0c4340-a657-43bd-9a98-4c8f50add720] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 633.413406] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7fb6a78a-97a0-4395-a477-94b21a9714d2 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 633.425719] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c65dd565-2550-4c76-a28e-7e9da6ae3711 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 633.462149] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-257a2071-2355-4811-bd3c-01dd84e48119 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 633.470685] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2d50862e-fc2c-4fad-9109-50949dae62be {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 633.485631] env[68906]: DEBUG nova.compute.provider_tree [None req-3d9af540-455f-4eac-be79-8ed7933ffa43 tempest-MigrationsAdminTest-890819033 tempest-MigrationsAdminTest-890819033-project-member] Inventory has not changed in ProviderTree for provider: 1119f6db-bfd7-4ef3-bdff-5c6974dc249b {{(pid=68906) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 633.496760] env[68906]: DEBUG nova.scheduler.client.report [None req-3d9af540-455f-4eac-be79-8ed7933ffa43 tempest-MigrationsAdminTest-890819033 tempest-MigrationsAdminTest-890819033-project-member] Inventory has not changed for provider 1119f6db-bfd7-4ef3-bdff-5c6974dc249b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 93, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68906) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 633.528527] env[68906]: DEBUG oslo_concurrency.lockutils [None req-3d9af540-455f-4eac-be79-8ed7933ffa43 
tempest-MigrationsAdminTest-890819033 tempest-MigrationsAdminTest-890819033-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.309s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 633.529271] env[68906]: DEBUG nova.compute.manager [None req-3d9af540-455f-4eac-be79-8ed7933ffa43 tempest-MigrationsAdminTest-890819033 tempest-MigrationsAdminTest-890819033-project-member] [instance: da0c4340-a657-43bd-9a98-4c8f50add720] Start building networks asynchronously for instance. {{(pid=68906) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 633.575089] env[68906]: DEBUG nova.compute.utils [None req-3d9af540-455f-4eac-be79-8ed7933ffa43 tempest-MigrationsAdminTest-890819033 tempest-MigrationsAdminTest-890819033-project-member] Using /dev/sd instead of None {{(pid=68906) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 633.581777] env[68906]: DEBUG nova.compute.manager [None req-3d9af540-455f-4eac-be79-8ed7933ffa43 tempest-MigrationsAdminTest-890819033 tempest-MigrationsAdminTest-890819033-project-member] [instance: da0c4340-a657-43bd-9a98-4c8f50add720] Allocating IP information in the background. {{(pid=68906) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 633.581968] env[68906]: DEBUG nova.network.neutron [None req-3d9af540-455f-4eac-be79-8ed7933ffa43 tempest-MigrationsAdminTest-890819033 tempest-MigrationsAdminTest-890819033-project-member] [instance: da0c4340-a657-43bd-9a98-4c8f50add720] allocate_for_instance() {{(pid=68906) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 633.596882] env[68906]: DEBUG nova.compute.manager [None req-3d9af540-455f-4eac-be79-8ed7933ffa43 tempest-MigrationsAdminTest-890819033 tempest-MigrationsAdminTest-890819033-project-member] [instance: da0c4340-a657-43bd-9a98-4c8f50add720] Start building block device mappings for instance. {{(pid=68906) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 633.706382] env[68906]: DEBUG nova.compute.manager [None req-3d9af540-455f-4eac-be79-8ed7933ffa43 tempest-MigrationsAdminTest-890819033 tempest-MigrationsAdminTest-890819033-project-member] [instance: da0c4340-a657-43bd-9a98-4c8f50add720] Start spawning the instance on the hypervisor. 
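"Inventory has not changed for provider ..." is the cheap path: the report client compares the inventory it last sent against the newly computed one and skips the placement update when they match. The comparison can be as simple as this sketch (exact-match semantics are an assumption here):

_inventory_cache = {}  # provider UUID -> last inventory sent to placement

def set_inventory_for_provider(provider_uuid, inventory):
    # Returns True only when placement actually needs an update.
    if _inventory_cache.get(provider_uuid) == inventory:
        print(f"Inventory has not changed for provider {provider_uuid}")
        return False
    _inventory_cache[provider_uuid] = inventory
    print(f"Updating inventory for provider {provider_uuid}")
    return True

inv = {"VCPU": {"total": 48, "reserved": 0, "allocation_ratio": 4.0}}
set_inventory_for_provider("1119f6db-bfd7-4ef3-bdff-5c6974dc249b", inv)  # updates
set_inventory_for_provider("1119f6db-bfd7-4ef3-bdff-5c6974dc249b", inv)  # no change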
{{(pid=68906) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 633.743478] env[68906]: DEBUG nova.virt.hardware [None req-3d9af540-455f-4eac-be79-8ed7933ffa43 tempest-MigrationsAdminTest-890819033 tempest-MigrationsAdminTest-890819033-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-17T13:00:39Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-17T13:00:23Z,direct_url=,disk_format='vmdk',id=b1400c31-d33b-4e13-944f-4c645e62493e,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='1ae7bf3a375d41c6af5e7536af51ffd1',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-17T13:00:24Z,virtual_size=,visibility=), allow threads: False {{(pid=68906) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 633.743478] env[68906]: DEBUG nova.virt.hardware [None req-3d9af540-455f-4eac-be79-8ed7933ffa43 tempest-MigrationsAdminTest-890819033 tempest-MigrationsAdminTest-890819033-project-member] Flavor limits 0:0:0 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 633.743478] env[68906]: DEBUG nova.virt.hardware [None req-3d9af540-455f-4eac-be79-8ed7933ffa43 tempest-MigrationsAdminTest-890819033 tempest-MigrationsAdminTest-890819033-project-member] Image limits 0:0:0 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 633.743767] env[68906]: DEBUG nova.virt.hardware [None req-3d9af540-455f-4eac-be79-8ed7933ffa43 tempest-MigrationsAdminTest-890819033 tempest-MigrationsAdminTest-890819033-project-member] Flavor pref 0:0:0 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 633.743767] env[68906]: DEBUG nova.virt.hardware [None req-3d9af540-455f-4eac-be79-8ed7933ffa43 tempest-MigrationsAdminTest-890819033 tempest-MigrationsAdminTest-890819033-project-member] Image pref 0:0:0 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 633.743897] env[68906]: DEBUG nova.virt.hardware [None req-3d9af540-455f-4eac-be79-8ed7933ffa43 tempest-MigrationsAdminTest-890819033 tempest-MigrationsAdminTest-890819033-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 633.743985] env[68906]: DEBUG nova.virt.hardware [None req-3d9af540-455f-4eac-be79-8ed7933ffa43 tempest-MigrationsAdminTest-890819033 tempest-MigrationsAdminTest-890819033-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68906) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 633.744162] env[68906]: DEBUG nova.virt.hardware [None req-3d9af540-455f-4eac-be79-8ed7933ffa43 tempest-MigrationsAdminTest-890819033 tempest-MigrationsAdminTest-890819033-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68906) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 633.745382] env[68906]: DEBUG nova.virt.hardware [None req-3d9af540-455f-4eac-be79-8ed7933ffa43 
tempest-MigrationsAdminTest-890819033 tempest-MigrationsAdminTest-890819033-project-member] Got 1 possible topologies {{(pid=68906) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 633.745617] env[68906]: DEBUG nova.virt.hardware [None req-3d9af540-455f-4eac-be79-8ed7933ffa43 tempest-MigrationsAdminTest-890819033 tempest-MigrationsAdminTest-890819033-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68906) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 633.745796] env[68906]: DEBUG nova.virt.hardware [None req-3d9af540-455f-4eac-be79-8ed7933ffa43 tempest-MigrationsAdminTest-890819033 tempest-MigrationsAdminTest-890819033-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68906) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 633.747518] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1b3adb69-5022-424d-85fb-4e5a5aadfaae {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 633.757924] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e2286d99-245c-4838-a6b7-3fd4365544fd {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 634.248165] env[68906]: DEBUG nova.policy [None req-3d9af540-455f-4eac-be79-8ed7933ffa43 tempest-MigrationsAdminTest-890819033 tempest-MigrationsAdminTest-890819033-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '962b82fa90eb4fceb91c7b9dc13a1c25', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '3686ca37c45c46e1bcc96a7abe3b5fa0', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68906) authorize /opt/stack/nova/nova/policy.py:203}} [ 634.804537] env[68906]: DEBUG nova.network.neutron [None req-6e630c8f-2ec9-418e-9d9b-a4ca682ae7cd tempest-FloatingIPsAssociationTestJSON-1370684228 tempest-FloatingIPsAssociationTestJSON-1370684228-project-member] [instance: 46481a4e-ac53-456d-b6cb-9f3ffbccf407] Successfully created port: 25488db1-53a4-49f8-9b01-2243a420bdbb {{(pid=68906) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 635.223366] env[68906]: DEBUG nova.network.neutron [None req-77526f0d-ba0d-4648-acf7-33789433c0ca tempest-ImagesNegativeTestJSON-533604680 tempest-ImagesNegativeTestJSON-533604680-project-member] [instance: d2258ded-478a-4530-b940-386286702048] Successfully created port: aa68d5f0-3302-4547-984f-1d67a625a3ca {{(pid=68906) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 637.093379] env[68906]: DEBUG nova.network.neutron [None req-3d9af540-455f-4eac-be79-8ed7933ffa43 tempest-MigrationsAdminTest-890819033 tempest-MigrationsAdminTest-890819033-project-member] [instance: da0c4340-a657-43bd-9a98-4c8f50add720] Successfully created port: eb5a0594-e1fd-4786-8944-e71fc85436cb {{(pid=68906) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 637.530546] env[68906]: DEBUG oslo_concurrency.lockutils [None req-96841733-5efb-421f-b645-bab4686e4f9f tempest-ServerActionsTestJSON-1120749817 
tempest-ServerActionsTestJSON-1120749817-project-member] Acquiring lock "0540a4dc-1b86-4776-b633-f540af168a2b" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 637.530788] env[68906]: DEBUG oslo_concurrency.lockutils [None req-96841733-5efb-421f-b645-bab4686e4f9f tempest-ServerActionsTestJSON-1120749817 tempest-ServerActionsTestJSON-1120749817-project-member] Lock "0540a4dc-1b86-4776-b633-f540af168a2b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 637.547233] env[68906]: DEBUG nova.compute.manager [None req-96841733-5efb-421f-b645-bab4686e4f9f tempest-ServerActionsTestJSON-1120749817 tempest-ServerActionsTestJSON-1120749817-project-member] [instance: 0540a4dc-1b86-4776-b633-f540af168a2b] Starting instance... {{(pid=68906) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 637.608713] env[68906]: DEBUG oslo_concurrency.lockutils [None req-96841733-5efb-421f-b645-bab4686e4f9f tempest-ServerActionsTestJSON-1120749817 tempest-ServerActionsTestJSON-1120749817-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 637.608987] env[68906]: DEBUG oslo_concurrency.lockutils [None req-96841733-5efb-421f-b645-bab4686e4f9f tempest-ServerActionsTestJSON-1120749817 tempest-ServerActionsTestJSON-1120749817-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 637.610697] env[68906]: INFO nova.compute.claims [None req-96841733-5efb-421f-b645-bab4686e4f9f tempest-ServerActionsTestJSON-1120749817 tempest-ServerActionsTestJSON-1120749817-project-member] [instance: 0540a4dc-1b86-4776-b633-f540af168a2b] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 637.793346] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4532de17-d620-45b8-91c7-e2cf2457e700 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 637.802612] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ca20877c-9333-4ccf-bbcb-18e3dd1b4e32 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 637.840895] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ce64adeb-d03f-47a4-bf57-1442de47e203 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 637.851472] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fa3e4ffe-c96c-4514-b16f-d904fdaaa406 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 637.870375] env[68906]: DEBUG nova.compute.provider_tree [None req-96841733-5efb-421f-b645-bab4686e4f9f tempest-ServerActionsTestJSON-1120749817 
tempest-ServerActionsTestJSON-1120749817-project-member] Inventory has not changed in ProviderTree for provider: 1119f6db-bfd7-4ef3-bdff-5c6974dc249b {{(pid=68906) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 637.881423] env[68906]: DEBUG nova.scheduler.client.report [None req-96841733-5efb-421f-b645-bab4686e4f9f tempest-ServerActionsTestJSON-1120749817 tempest-ServerActionsTestJSON-1120749817-project-member] Inventory has not changed for provider 1119f6db-bfd7-4ef3-bdff-5c6974dc249b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 93, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68906) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 637.898450] env[68906]: DEBUG oslo_concurrency.lockutils [None req-96841733-5efb-421f-b645-bab4686e4f9f tempest-ServerActionsTestJSON-1120749817 tempest-ServerActionsTestJSON-1120749817-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.289s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 637.899032] env[68906]: DEBUG nova.compute.manager [None req-96841733-5efb-421f-b645-bab4686e4f9f tempest-ServerActionsTestJSON-1120749817 tempest-ServerActionsTestJSON-1120749817-project-member] [instance: 0540a4dc-1b86-4776-b633-f540af168a2b] Start building networks asynchronously for instance. {{(pid=68906) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 637.941759] env[68906]: DEBUG nova.compute.utils [None req-96841733-5efb-421f-b645-bab4686e4f9f tempest-ServerActionsTestJSON-1120749817 tempest-ServerActionsTestJSON-1120749817-project-member] Using /dev/sd instead of None {{(pid=68906) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 637.944577] env[68906]: DEBUG nova.compute.manager [None req-96841733-5efb-421f-b645-bab4686e4f9f tempest-ServerActionsTestJSON-1120749817 tempest-ServerActionsTestJSON-1120749817-project-member] [instance: 0540a4dc-1b86-4776-b633-f540af168a2b] Allocating IP information in the background. {{(pid=68906) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 637.944998] env[68906]: DEBUG nova.network.neutron [None req-96841733-5efb-421f-b645-bab4686e4f9f tempest-ServerActionsTestJSON-1120749817 tempest-ServerActionsTestJSON-1120749817-project-member] [instance: 0540a4dc-1b86-4776-b633-f540af168a2b] allocate_for_instance() {{(pid=68906) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 637.959886] env[68906]: DEBUG nova.compute.manager [None req-96841733-5efb-421f-b645-bab4686e4f9f tempest-ServerActionsTestJSON-1120749817 tempest-ServerActionsTestJSON-1120749817-project-member] [instance: 0540a4dc-1b86-4776-b633-f540af168a2b] Start building block device mappings for instance. {{(pid=68906) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 638.049447] env[68906]: DEBUG nova.compute.manager [None req-96841733-5efb-421f-b645-bab4686e4f9f tempest-ServerActionsTestJSON-1120749817 tempest-ServerActionsTestJSON-1120749817-project-member] [instance: 0540a4dc-1b86-4776-b633-f540af168a2b] Start spawning the instance on the hypervisor. 
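"Claim successful" above means the requested resources fit under the inventory's effective capacity, which for each resource class is (total - reserved) * allocation_ratio; with allocation_ratio=4.0 the 48 physical vCPUs admit up to 192 claimed vCPUs. A sketch of that check (the fit rule is a simplification of the real claim logic):

def capacity(res):
    # Effective capacity for one resource class.
    return (res["total"] - res.get("reserved", 0)) * res.get("allocation_ratio", 1.0)

inventory = {
    "VCPU":      {"total": 48,     "reserved": 0,   "allocation_ratio": 4.0},
    "MEMORY_MB": {"total": 196590, "reserved": 512, "allocation_ratio": 1.0},
    "DISK_GB":   {"total": 400,    "reserved": 0,   "allocation_ratio": 1.0},
}
used = {"VCPU": 4, "MEMORY_MB": 512, "DISK_GB": 4}
request = {"VCPU": 1, "MEMORY_MB": 128, "DISK_GB": 1}  # one m1.nano instance

fits = all(used[r] + request[r] <= capacity(inventory[r]) for r in request)
print("Claim successful" if fits else "Claim failed")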
{{(pid=68906) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 638.083936] env[68906]: DEBUG nova.virt.hardware [None req-96841733-5efb-421f-b645-bab4686e4f9f tempest-ServerActionsTestJSON-1120749817 tempest-ServerActionsTestJSON-1120749817-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-17T13:00:39Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-17T13:00:23Z,direct_url=,disk_format='vmdk',id=b1400c31-d33b-4e13-944f-4c645e62493e,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='1ae7bf3a375d41c6af5e7536af51ffd1',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-17T13:00:24Z,virtual_size=,visibility=), allow threads: False {{(pid=68906) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 638.086038] env[68906]: DEBUG nova.virt.hardware [None req-96841733-5efb-421f-b645-bab4686e4f9f tempest-ServerActionsTestJSON-1120749817 tempest-ServerActionsTestJSON-1120749817-project-member] Flavor limits 0:0:0 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 638.086038] env[68906]: DEBUG nova.virt.hardware [None req-96841733-5efb-421f-b645-bab4686e4f9f tempest-ServerActionsTestJSON-1120749817 tempest-ServerActionsTestJSON-1120749817-project-member] Image limits 0:0:0 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 638.086038] env[68906]: DEBUG nova.virt.hardware [None req-96841733-5efb-421f-b645-bab4686e4f9f tempest-ServerActionsTestJSON-1120749817 tempest-ServerActionsTestJSON-1120749817-project-member] Flavor pref 0:0:0 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 638.086038] env[68906]: DEBUG nova.virt.hardware [None req-96841733-5efb-421f-b645-bab4686e4f9f tempest-ServerActionsTestJSON-1120749817 tempest-ServerActionsTestJSON-1120749817-project-member] Image pref 0:0:0 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 638.086038] env[68906]: DEBUG nova.virt.hardware [None req-96841733-5efb-421f-b645-bab4686e4f9f tempest-ServerActionsTestJSON-1120749817 tempest-ServerActionsTestJSON-1120749817-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 638.086287] env[68906]: DEBUG nova.virt.hardware [None req-96841733-5efb-421f-b645-bab4686e4f9f tempest-ServerActionsTestJSON-1120749817 tempest-ServerActionsTestJSON-1120749817-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68906) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 638.086287] env[68906]: DEBUG nova.virt.hardware [None req-96841733-5efb-421f-b645-bab4686e4f9f tempest-ServerActionsTestJSON-1120749817 tempest-ServerActionsTestJSON-1120749817-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68906) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 638.086287] env[68906]: DEBUG nova.virt.hardware [None 
req-96841733-5efb-421f-b645-bab4686e4f9f tempest-ServerActionsTestJSON-1120749817 tempest-ServerActionsTestJSON-1120749817-project-member] Got 1 possible topologies {{(pid=68906) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 638.086287] env[68906]: DEBUG nova.virt.hardware [None req-96841733-5efb-421f-b645-bab4686e4f9f tempest-ServerActionsTestJSON-1120749817 tempest-ServerActionsTestJSON-1120749817-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68906) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 638.086287] env[68906]: DEBUG nova.virt.hardware [None req-96841733-5efb-421f-b645-bab4686e4f9f tempest-ServerActionsTestJSON-1120749817 tempest-ServerActionsTestJSON-1120749817-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68906) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 638.087010] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f1959755-ca49-432c-a069-a5b3e49cf123 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 638.098134] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-13c0030d-bf6b-428c-ae83-1b8e414ffea3 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 638.306308] env[68906]: DEBUG nova.policy [None req-96841733-5efb-421f-b645-bab4686e4f9f tempest-ServerActionsTestJSON-1120749817 tempest-ServerActionsTestJSON-1120749817-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '1735eb54ec6e4ea682a4d19161f32409', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '5c2305d3a9b443f29494e5e234e0f492', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68906) authorize /opt/stack/nova/nova/policy.py:203}} [ 638.733632] env[68906]: DEBUG nova.network.neutron [None req-6e630c8f-2ec9-418e-9d9b-a4ca682ae7cd tempest-FloatingIPsAssociationTestJSON-1370684228 tempest-FloatingIPsAssociationTestJSON-1370684228-project-member] [instance: 46481a4e-ac53-456d-b6cb-9f3ffbccf407] Successfully updated port: 25488db1-53a4-49f8-9b01-2243a420bdbb {{(pid=68906) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 638.747301] env[68906]: DEBUG oslo_concurrency.lockutils [None req-6e630c8f-2ec9-418e-9d9b-a4ca682ae7cd tempest-FloatingIPsAssociationTestJSON-1370684228 tempest-FloatingIPsAssociationTestJSON-1370684228-project-member] Acquiring lock "refresh_cache-46481a4e-ac53-456d-b6cb-9f3ffbccf407" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 638.750023] env[68906]: DEBUG oslo_concurrency.lockutils [None req-6e630c8f-2ec9-418e-9d9b-a4ca682ae7cd tempest-FloatingIPsAssociationTestJSON-1370684228 tempest-FloatingIPsAssociationTestJSON-1370684228-project-member] Acquired lock "refresh_cache-46481a4e-ac53-456d-b6cb-9f3ffbccf407" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 638.750023] env[68906]: DEBUG nova.network.neutron [None req-6e630c8f-2ec9-418e-9d9b-a4ca682ae7cd 
tempest-FloatingIPsAssociationTestJSON-1370684228 tempest-FloatingIPsAssociationTestJSON-1370684228-project-member] [instance: 46481a4e-ac53-456d-b6cb-9f3ffbccf407] Building network info cache for instance {{(pid=68906) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 638.806303] env[68906]: DEBUG nova.compute.manager [req-a39e41a0-9f24-4cc0-945f-af6777119471 req-4201beb8-2596-430a-b61e-dbea076041cd service nova] [instance: 57feb127-36f1-403c-bbca-7054286c1972] Received event network-changed-6c63bf97-aa05-4edb-95ea-4046e0527259 {{(pid=68906) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 638.806303] env[68906]: DEBUG nova.compute.manager [req-a39e41a0-9f24-4cc0-945f-af6777119471 req-4201beb8-2596-430a-b61e-dbea076041cd service nova] [instance: 57feb127-36f1-403c-bbca-7054286c1972] Refreshing instance network info cache due to event network-changed-6c63bf97-aa05-4edb-95ea-4046e0527259. {{(pid=68906) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 638.806303] env[68906]: DEBUG oslo_concurrency.lockutils [req-a39e41a0-9f24-4cc0-945f-af6777119471 req-4201beb8-2596-430a-b61e-dbea076041cd service nova] Acquiring lock "refresh_cache-57feb127-36f1-403c-bbca-7054286c1972" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 638.806303] env[68906]: DEBUG oslo_concurrency.lockutils [req-a39e41a0-9f24-4cc0-945f-af6777119471 req-4201beb8-2596-430a-b61e-dbea076041cd service nova] Acquired lock "refresh_cache-57feb127-36f1-403c-bbca-7054286c1972" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 638.806303] env[68906]: DEBUG nova.network.neutron [req-a39e41a0-9f24-4cc0-945f-af6777119471 req-4201beb8-2596-430a-b61e-dbea076041cd service nova] [instance: 57feb127-36f1-403c-bbca-7054286c1972] Refreshing network info cache for port 6c63bf97-aa05-4edb-95ea-4046e0527259 {{(pid=68906) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 638.941122] env[68906]: DEBUG nova.network.neutron [None req-6e630c8f-2ec9-418e-9d9b-a4ca682ae7cd tempest-FloatingIPsAssociationTestJSON-1370684228 tempest-FloatingIPsAssociationTestJSON-1370684228-project-member] [instance: 46481a4e-ac53-456d-b6cb-9f3ffbccf407] Instance cache missing network info. 
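Both the build path and the external-event handler above serialize on a per-instance "refresh_cache-<uuid>" lock before touching the network info cache, so a port update and a cache rebuild cannot interleave. The per-key locking idea, in miniature (the lock naming and cache shape are assumptions for illustration):

import threading

_cache_locks = {}     # lock name -> threading.Lock, created on demand
_nw_info_cache = {}   # instance UUID -> cached network_info list

def refresh_nw_cache(instance_uuid, fetch):
    # Serialize refreshes per instance; setdefault keeps one lock per name.
    lock = _cache_locks.setdefault(f"refresh_cache-{instance_uuid}", threading.Lock())
    with lock:
        _nw_info_cache[instance_uuid] = fetch(instance_uuid)
        return _nw_info_cache[instance_uuid]

print(refresh_nw_cache("46481a4e-ac53-456d-b6cb-9f3ffbccf407", lambda uuid: []))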
{{(pid=68906) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 639.130160] env[68906]: DEBUG nova.network.neutron [None req-77526f0d-ba0d-4648-acf7-33789433c0ca tempest-ImagesNegativeTestJSON-533604680 tempest-ImagesNegativeTestJSON-533604680-project-member] [instance: d2258ded-478a-4530-b940-386286702048] Successfully updated port: aa68d5f0-3302-4547-984f-1d67a625a3ca {{(pid=68906) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 639.173143] env[68906]: DEBUG oslo_concurrency.lockutils [None req-77526f0d-ba0d-4648-acf7-33789433c0ca tempest-ImagesNegativeTestJSON-533604680 tempest-ImagesNegativeTestJSON-533604680-project-member] Acquiring lock "refresh_cache-d2258ded-478a-4530-b940-386286702048" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 639.173143] env[68906]: DEBUG oslo_concurrency.lockutils [None req-77526f0d-ba0d-4648-acf7-33789433c0ca tempest-ImagesNegativeTestJSON-533604680 tempest-ImagesNegativeTestJSON-533604680-project-member] Acquired lock "refresh_cache-d2258ded-478a-4530-b940-386286702048" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 639.173143] env[68906]: DEBUG nova.network.neutron [None req-77526f0d-ba0d-4648-acf7-33789433c0ca tempest-ImagesNegativeTestJSON-533604680 tempest-ImagesNegativeTestJSON-533604680-project-member] [instance: d2258ded-478a-4530-b940-386286702048] Building network info cache for instance {{(pid=68906) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 639.394712] env[68906]: DEBUG nova.network.neutron [None req-77526f0d-ba0d-4648-acf7-33789433c0ca tempest-ImagesNegativeTestJSON-533604680 tempest-ImagesNegativeTestJSON-533604680-project-member] [instance: d2258ded-478a-4530-b940-386286702048] Instance cache missing network info. {{(pid=68906) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 639.772018] env[68906]: DEBUG nova.network.neutron [None req-96841733-5efb-421f-b645-bab4686e4f9f tempest-ServerActionsTestJSON-1120749817 tempest-ServerActionsTestJSON-1120749817-project-member] [instance: 0540a4dc-1b86-4776-b633-f540af168a2b] Successfully created port: f99c8fa6-99d5-43ed-b528-9d2b22675c2b {{(pid=68906) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 640.073947] env[68906]: DEBUG nova.network.neutron [req-a39e41a0-9f24-4cc0-945f-af6777119471 req-4201beb8-2596-430a-b61e-dbea076041cd service nova] [instance: 57feb127-36f1-403c-bbca-7054286c1972] Updated VIF entry in instance network info cache for port 6c63bf97-aa05-4edb-95ea-4046e0527259. 
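"Updated VIF entry in instance network info cache for port ..." boils down to replacing the list element whose id matches the changed port. A minimal version of that update (the cache is modeled as a plain list of dicts):

def update_vif_entry(nw_info, new_vif):
    # Replace the cached VIF with a matching id; append if it is not cached yet.
    for i, vif in enumerate(nw_info):
        if vif["id"] == new_vif["id"]:
            nw_info[i] = new_vif
            return nw_info
    nw_info.append(new_vif)
    return nw_info

cache = [{"id": "6c63bf97-aa05-4edb-95ea-4046e0527259", "active": False}]
update_vif_entry(cache, {"id": "6c63bf97-aa05-4edb-95ea-4046e0527259", "active": True})
print(cache)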
{{(pid=68906) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 640.073947] env[68906]: DEBUG nova.network.neutron [req-a39e41a0-9f24-4cc0-945f-af6777119471 req-4201beb8-2596-430a-b61e-dbea076041cd service nova] [instance: 57feb127-36f1-403c-bbca-7054286c1972] Updating instance_info_cache with network_info: [{"id": "6c63bf97-aa05-4edb-95ea-4046e0527259", "address": "fa:16:3e:fa:b4:2c", "network": {"id": "63efabfb-0028-4758-9626-5f9860440121", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.104", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "1ae7bf3a375d41c6af5e7536af51ffd1", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "69054a13-b7ef-44e1-bd3b-3ca5ba602848", "external-id": "nsx-vlan-transportzone-153", "segmentation_id": 153, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap6c63bf97-aa", "ovs_interfaceid": "6c63bf97-aa05-4edb-95ea-4046e0527259", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68906) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 640.089229] env[68906]: DEBUG oslo_concurrency.lockutils [req-a39e41a0-9f24-4cc0-945f-af6777119471 req-4201beb8-2596-430a-b61e-dbea076041cd service nova] Releasing lock "refresh_cache-57feb127-36f1-403c-bbca-7054286c1972" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 640.089229] env[68906]: DEBUG nova.compute.manager [req-a39e41a0-9f24-4cc0-945f-af6777119471 req-4201beb8-2596-430a-b61e-dbea076041cd service nova] [instance: e2ee8d01-b1d3-4bde-81ae-668ffeef42b0] Received event network-vif-plugged-9bdacd11-d4fc-4e1f-aac2-e4d8d097bac8 {{(pid=68906) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 640.089229] env[68906]: DEBUG oslo_concurrency.lockutils [req-a39e41a0-9f24-4cc0-945f-af6777119471 req-4201beb8-2596-430a-b61e-dbea076041cd service nova] Acquiring lock "e2ee8d01-b1d3-4bde-81ae-668ffeef42b0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 640.089229] env[68906]: DEBUG oslo_concurrency.lockutils [req-a39e41a0-9f24-4cc0-945f-af6777119471 req-4201beb8-2596-430a-b61e-dbea076041cd service nova] Lock "e2ee8d01-b1d3-4bde-81ae-668ffeef42b0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 640.089477] env[68906]: DEBUG oslo_concurrency.lockutils [req-a39e41a0-9f24-4cc0-945f-af6777119471 req-4201beb8-2596-430a-b61e-dbea076041cd service nova] Lock "e2ee8d01-b1d3-4bde-81ae-668ffeef42b0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 640.089477] env[68906]: DEBUG nova.compute.manager 
[req-a39e41a0-9f24-4cc0-945f-af6777119471 req-4201beb8-2596-430a-b61e-dbea076041cd service nova] [instance: e2ee8d01-b1d3-4bde-81ae-668ffeef42b0] No waiting events found dispatching network-vif-plugged-9bdacd11-d4fc-4e1f-aac2-e4d8d097bac8 {{(pid=68906) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 640.089477] env[68906]: WARNING nova.compute.manager [req-a39e41a0-9f24-4cc0-945f-af6777119471 req-4201beb8-2596-430a-b61e-dbea076041cd service nova] [instance: e2ee8d01-b1d3-4bde-81ae-668ffeef42b0] Received unexpected event network-vif-plugged-9bdacd11-d4fc-4e1f-aac2-e4d8d097bac8 for instance with vm_state building and task_state spawning. [ 640.089477] env[68906]: DEBUG nova.compute.manager [req-a39e41a0-9f24-4cc0-945f-af6777119471 req-4201beb8-2596-430a-b61e-dbea076041cd service nova] [instance: e2ee8d01-b1d3-4bde-81ae-668ffeef42b0] Received event network-changed-9bdacd11-d4fc-4e1f-aac2-e4d8d097bac8 {{(pid=68906) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 640.089627] env[68906]: DEBUG nova.compute.manager [req-a39e41a0-9f24-4cc0-945f-af6777119471 req-4201beb8-2596-430a-b61e-dbea076041cd service nova] [instance: e2ee8d01-b1d3-4bde-81ae-668ffeef42b0] Refreshing instance network info cache due to event network-changed-9bdacd11-d4fc-4e1f-aac2-e4d8d097bac8. {{(pid=68906) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 640.089627] env[68906]: DEBUG oslo_concurrency.lockutils [req-a39e41a0-9f24-4cc0-945f-af6777119471 req-4201beb8-2596-430a-b61e-dbea076041cd service nova] Acquiring lock "refresh_cache-e2ee8d01-b1d3-4bde-81ae-668ffeef42b0" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 640.089627] env[68906]: DEBUG oslo_concurrency.lockutils [req-a39e41a0-9f24-4cc0-945f-af6777119471 req-4201beb8-2596-430a-b61e-dbea076041cd service nova] Acquired lock "refresh_cache-e2ee8d01-b1d3-4bde-81ae-668ffeef42b0" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 640.089627] env[68906]: DEBUG nova.network.neutron [req-a39e41a0-9f24-4cc0-945f-af6777119471 req-4201beb8-2596-430a-b61e-dbea076041cd service nova] [instance: e2ee8d01-b1d3-4bde-81ae-668ffeef42b0] Refreshing network info cache for port 9bdacd11-d4fc-4e1f-aac2-e4d8d097bac8 {{(pid=68906) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 640.175034] env[68906]: DEBUG nova.network.neutron [None req-6e630c8f-2ec9-418e-9d9b-a4ca682ae7cd tempest-FloatingIPsAssociationTestJSON-1370684228 tempest-FloatingIPsAssociationTestJSON-1370684228-project-member] [instance: 46481a4e-ac53-456d-b6cb-9f3ffbccf407] Updating instance_info_cache with network_info: [{"id": "25488db1-53a4-49f8-9b01-2243a420bdbb", "address": "fa:16:3e:a9:b2:61", "network": {"id": "660d72f0-d6d8-4de8-bfee-2acd35981f86", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-571430305-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "43130cf9c64b405a9e5e0094228c72de", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": 
"a64108f9-df0a-4feb-bbb5-97f5841c356c", "external-id": "nsx-vlan-transportzone-67", "segmentation_id": 67, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap25488db1-53", "ovs_interfaceid": "25488db1-53a4-49f8-9b01-2243a420bdbb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68906) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 640.196911] env[68906]: DEBUG oslo_concurrency.lockutils [None req-6e630c8f-2ec9-418e-9d9b-a4ca682ae7cd tempest-FloatingIPsAssociationTestJSON-1370684228 tempest-FloatingIPsAssociationTestJSON-1370684228-project-member] Releasing lock "refresh_cache-46481a4e-ac53-456d-b6cb-9f3ffbccf407" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 640.197240] env[68906]: DEBUG nova.compute.manager [None req-6e630c8f-2ec9-418e-9d9b-a4ca682ae7cd tempest-FloatingIPsAssociationTestJSON-1370684228 tempest-FloatingIPsAssociationTestJSON-1370684228-project-member] [instance: 46481a4e-ac53-456d-b6cb-9f3ffbccf407] Instance network_info: |[{"id": "25488db1-53a4-49f8-9b01-2243a420bdbb", "address": "fa:16:3e:a9:b2:61", "network": {"id": "660d72f0-d6d8-4de8-bfee-2acd35981f86", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-571430305-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "43130cf9c64b405a9e5e0094228c72de", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "a64108f9-df0a-4feb-bbb5-97f5841c356c", "external-id": "nsx-vlan-transportzone-67", "segmentation_id": 67, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap25488db1-53", "ovs_interfaceid": "25488db1-53a4-49f8-9b01-2243a420bdbb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=68906) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 640.197633] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-6e630c8f-2ec9-418e-9d9b-a4ca682ae7cd tempest-FloatingIPsAssociationTestJSON-1370684228 tempest-FloatingIPsAssociationTestJSON-1370684228-project-member] [instance: 46481a4e-ac53-456d-b6cb-9f3ffbccf407] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:a9:b2:61', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'a64108f9-df0a-4feb-bbb5-97f5841c356c', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '25488db1-53a4-49f8-9b01-2243a420bdbb', 'vif_model': 'vmxnet3'}] {{(pid=68906) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 640.214654] env[68906]: DEBUG nova.virt.vmwareapi.vm_util [None req-6e630c8f-2ec9-418e-9d9b-a4ca682ae7cd tempest-FloatingIPsAssociationTestJSON-1370684228 tempest-FloatingIPsAssociationTestJSON-1370684228-project-member] Creating folder: Project (43130cf9c64b405a9e5e0094228c72de). Parent ref: group-v694750. 
{{(pid=68906) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 640.214654] env[68906]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-a3fcc339-47f7-47bd-bc19-08f32bc7600b {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 640.228801] env[68906]: INFO nova.virt.vmwareapi.vm_util [None req-6e630c8f-2ec9-418e-9d9b-a4ca682ae7cd tempest-FloatingIPsAssociationTestJSON-1370684228 tempest-FloatingIPsAssociationTestJSON-1370684228-project-member] Created folder: Project (43130cf9c64b405a9e5e0094228c72de) in parent group-v694750. [ 640.228801] env[68906]: DEBUG nova.virt.vmwareapi.vm_util [None req-6e630c8f-2ec9-418e-9d9b-a4ca682ae7cd tempest-FloatingIPsAssociationTestJSON-1370684228 tempest-FloatingIPsAssociationTestJSON-1370684228-project-member] Creating folder: Instances. Parent ref: group-v694757. {{(pid=68906) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 640.228801] env[68906]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-5bea94b3-9f64-4723-b58e-6c4ae4c8fc17 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 640.238840] env[68906]: INFO nova.virt.vmwareapi.vm_util [None req-6e630c8f-2ec9-418e-9d9b-a4ca682ae7cd tempest-FloatingIPsAssociationTestJSON-1370684228 tempest-FloatingIPsAssociationTestJSON-1370684228-project-member] Created folder: Instances in parent group-v694757. [ 640.238840] env[68906]: DEBUG oslo.service.loopingcall [None req-6e630c8f-2ec9-418e-9d9b-a4ca682ae7cd tempest-FloatingIPsAssociationTestJSON-1370684228 tempest-FloatingIPsAssociationTestJSON-1370684228-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=68906) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 640.238840] env[68906]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 46481a4e-ac53-456d-b6cb-9f3ffbccf407] Creating VM on the ESX host {{(pid=68906) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 640.242568] env[68906]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-6c81ec96-b412-44e0-a4e7-471563eae3ca {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 640.275581] env[68906]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 640.275581] env[68906]: value = "task-3475260" [ 640.275581] env[68906]: _type = "Task" [ 640.275581] env[68906]: } to complete. {{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 640.286061] env[68906]: DEBUG oslo_vmware.api [-] Task: {'id': task-3475260, 'name': CreateVM_Task} progress is 0%. 
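CreateVM_Task is asynchronous: the API call returns a task handle ("task-3475260") and the caller polls it, logging progress, until it succeeds or errors. A generic polling loop in the same spirit (the task dict shape and states below are simplified assumptions, not the vSphere task model):

import time

def wait_for_task(poll, interval=0.5, timeout=300.0):
    # Poll until the task leaves its running state, or give up at the deadline.
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        task = poll()
        if task["state"] == "success":
            return task.get("result")
        if task["state"] == "error":
            raise RuntimeError(task.get("error", "task failed"))
        print(f"Task: {task['id']} progress is {task.get('progress', 0)}%.")
        time.sleep(interval)
    raise TimeoutError("task did not complete in time")

_states = iter([{"id": "task-3475260", "state": "running", "progress": 0},
                {"id": "task-3475260", "state": "success", "result": "vm-1"}])
print(wait_for_task(lambda: next(_states), interval=0.01))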
{{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 640.488788] env[68906]: DEBUG nova.network.neutron [None req-77526f0d-ba0d-4648-acf7-33789433c0ca tempest-ImagesNegativeTestJSON-533604680 tempest-ImagesNegativeTestJSON-533604680-project-member] [instance: d2258ded-478a-4530-b940-386286702048] Updating instance_info_cache with network_info: [{"id": "aa68d5f0-3302-4547-984f-1d67a625a3ca", "address": "fa:16:3e:73:9d:b5", "network": {"id": "033447c9-ca96-4ca5-acc7-ad331d809250", "bridge": "br-int", "label": "tempest-ImagesNegativeTestJSON-1285164177-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "7d38ee4d80bc4d1f87f4cb1f3907fb54", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "d4b43a78-f49b-4132-ab2e-6e28769a9498", "external-id": "nsx-vlan-transportzone-737", "segmentation_id": 737, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapaa68d5f0-33", "ovs_interfaceid": "aa68d5f0-3302-4547-984f-1d67a625a3ca", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68906) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 640.501261] env[68906]: DEBUG oslo_concurrency.lockutils [None req-77526f0d-ba0d-4648-acf7-33789433c0ca tempest-ImagesNegativeTestJSON-533604680 tempest-ImagesNegativeTestJSON-533604680-project-member] Releasing lock "refresh_cache-d2258ded-478a-4530-b940-386286702048" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 640.501261] env[68906]: DEBUG nova.compute.manager [None req-77526f0d-ba0d-4648-acf7-33789433c0ca tempest-ImagesNegativeTestJSON-533604680 tempest-ImagesNegativeTestJSON-533604680-project-member] [instance: d2258ded-478a-4530-b940-386286702048] Instance network_info: |[{"id": "aa68d5f0-3302-4547-984f-1d67a625a3ca", "address": "fa:16:3e:73:9d:b5", "network": {"id": "033447c9-ca96-4ca5-acc7-ad331d809250", "bridge": "br-int", "label": "tempest-ImagesNegativeTestJSON-1285164177-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "7d38ee4d80bc4d1f87f4cb1f3907fb54", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "d4b43a78-f49b-4132-ab2e-6e28769a9498", "external-id": "nsx-vlan-transportzone-737", "segmentation_id": 737, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapaa68d5f0-33", "ovs_interfaceid": "aa68d5f0-3302-4547-984f-1d67a625a3ca", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=68906) _allocate_network_async 
/opt/stack/nova/nova/compute/manager.py:1971}} [ 640.501443] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-77526f0d-ba0d-4648-acf7-33789433c0ca tempest-ImagesNegativeTestJSON-533604680 tempest-ImagesNegativeTestJSON-533604680-project-member] [instance: d2258ded-478a-4530-b940-386286702048] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:73:9d:b5', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'd4b43a78-f49b-4132-ab2e-6e28769a9498', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'aa68d5f0-3302-4547-984f-1d67a625a3ca', 'vif_model': 'vmxnet3'}] {{(pid=68906) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 640.516155] env[68906]: DEBUG nova.virt.vmwareapi.vm_util [None req-77526f0d-ba0d-4648-acf7-33789433c0ca tempest-ImagesNegativeTestJSON-533604680 tempest-ImagesNegativeTestJSON-533604680-project-member] Creating folder: Project (7d38ee4d80bc4d1f87f4cb1f3907fb54). Parent ref: group-v694750. {{(pid=68906) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 640.516155] env[68906]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-6d71f59b-b63d-42a9-8495-608b543a7118 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 640.530530] env[68906]: INFO nova.virt.vmwareapi.vm_util [None req-77526f0d-ba0d-4648-acf7-33789433c0ca tempest-ImagesNegativeTestJSON-533604680 tempest-ImagesNegativeTestJSON-533604680-project-member] Created folder: Project (7d38ee4d80bc4d1f87f4cb1f3907fb54) in parent group-v694750. [ 640.530620] env[68906]: DEBUG nova.virt.vmwareapi.vm_util [None req-77526f0d-ba0d-4648-acf7-33789433c0ca tempest-ImagesNegativeTestJSON-533604680 tempest-ImagesNegativeTestJSON-533604680-project-member] Creating folder: Instances. Parent ref: group-v694760. {{(pid=68906) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 640.530869] env[68906]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-5803048e-fd34-4d67-96dd-369bc4e31751 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 640.544925] env[68906]: INFO nova.virt.vmwareapi.vm_util [None req-77526f0d-ba0d-4648-acf7-33789433c0ca tempest-ImagesNegativeTestJSON-533604680 tempest-ImagesNegativeTestJSON-533604680-project-member] Created folder: Instances in parent group-v694760. [ 640.544925] env[68906]: DEBUG oslo.service.loopingcall [None req-77526f0d-ba0d-4648-acf7-33789433c0ca tempest-ImagesNegativeTestJSON-533604680 tempest-ImagesNegativeTestJSON-533604680-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=68906) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 640.544925] env[68906]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: d2258ded-478a-4530-b940-386286702048] Creating VM on the ESX host {{(pid=68906) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 640.544925] env[68906]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-ee7e4f59-13a3-4aeb-8c3d-dcf4010fb992 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 640.566461] env[68906]: DEBUG nova.network.neutron [None req-3d9af540-455f-4eac-be79-8ed7933ffa43 tempest-MigrationsAdminTest-890819033 tempest-MigrationsAdminTest-890819033-project-member] [instance: da0c4340-a657-43bd-9a98-4c8f50add720] Successfully updated port: eb5a0594-e1fd-4786-8944-e71fc85436cb {{(pid=68906) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 640.579758] env[68906]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 640.579758] env[68906]: value = "task-3475263" [ 640.579758] env[68906]: _type = "Task" [ 640.579758] env[68906]: } to complete. {{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 640.584126] env[68906]: DEBUG oslo_concurrency.lockutils [None req-3d9af540-455f-4eac-be79-8ed7933ffa43 tempest-MigrationsAdminTest-890819033 tempest-MigrationsAdminTest-890819033-project-member] Acquiring lock "refresh_cache-da0c4340-a657-43bd-9a98-4c8f50add720" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 640.584272] env[68906]: DEBUG oslo_concurrency.lockutils [None req-3d9af540-455f-4eac-be79-8ed7933ffa43 tempest-MigrationsAdminTest-890819033 tempest-MigrationsAdminTest-890819033-project-member] Acquired lock "refresh_cache-da0c4340-a657-43bd-9a98-4c8f50add720" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 640.584421] env[68906]: DEBUG nova.network.neutron [None req-3d9af540-455f-4eac-be79-8ed7933ffa43 tempest-MigrationsAdminTest-890819033 tempest-MigrationsAdminTest-890819033-project-member] [instance: da0c4340-a657-43bd-9a98-4c8f50add720] Building network info cache for instance {{(pid=68906) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 640.593574] env[68906]: DEBUG oslo_vmware.api [-] Task: {'id': task-3475263, 'name': CreateVM_Task} progress is 6%. {{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 640.710496] env[68906]: DEBUG nova.network.neutron [None req-3d9af540-455f-4eac-be79-8ed7933ffa43 tempest-MigrationsAdminTest-890819033 tempest-MigrationsAdminTest-890819033-project-member] [instance: da0c4340-a657-43bd-9a98-4c8f50add720] Instance cache missing network info. {{(pid=68906) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 640.787443] env[68906]: DEBUG oslo_vmware.api [-] Task: {'id': task-3475260, 'name': CreateVM_Task, 'duration_secs': 0.319534} completed successfully. 
{{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 640.787635] env[68906]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 46481a4e-ac53-456d-b6cb-9f3ffbccf407] Created VM on the ESX host {{(pid=68906) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 640.788387] env[68906]: DEBUG oslo_concurrency.lockutils [None req-6e630c8f-2ec9-418e-9d9b-a4ca682ae7cd tempest-FloatingIPsAssociationTestJSON-1370684228 tempest-FloatingIPsAssociationTestJSON-1370684228-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 640.788603] env[68906]: DEBUG oslo_concurrency.lockutils [None req-6e630c8f-2ec9-418e-9d9b-a4ca682ae7cd tempest-FloatingIPsAssociationTestJSON-1370684228 tempest-FloatingIPsAssociationTestJSON-1370684228-project-member] Acquired lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 640.788938] env[68906]: DEBUG oslo_concurrency.lockutils [None req-6e630c8f-2ec9-418e-9d9b-a4ca682ae7cd tempest-FloatingIPsAssociationTestJSON-1370684228 tempest-FloatingIPsAssociationTestJSON-1370684228-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 640.790036] env[68906]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-675a011a-1314-4e99-97a3-f121b767a142 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 640.794735] env[68906]: DEBUG oslo_vmware.api [None req-6e630c8f-2ec9-418e-9d9b-a4ca682ae7cd tempest-FloatingIPsAssociationTestJSON-1370684228 tempest-FloatingIPsAssociationTestJSON-1370684228-project-member] Waiting for the task: (returnval){ [ 640.794735] env[68906]: value = "session[52a3cb0d-1212-490c-2669-91043b4da4d8]52553b66-fe42-07c6-a7f8-52591a79aaba" [ 640.794735] env[68906]: _type = "Task" [ 640.794735] env[68906]: } to complete. {{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 640.803707] env[68906]: DEBUG oslo_vmware.api [None req-6e630c8f-2ec9-418e-9d9b-a4ca682ae7cd tempest-FloatingIPsAssociationTestJSON-1370684228 tempest-FloatingIPsAssociationTestJSON-1370684228-project-member] Task: {'id': session[52a3cb0d-1212-490c-2669-91043b4da4d8]52553b66-fe42-07c6-a7f8-52591a79aaba, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 641.093908] env[68906]: DEBUG oslo_vmware.api [-] Task: {'id': task-3475263, 'name': CreateVM_Task, 'duration_secs': 0.296192} completed successfully. 
{{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 641.093908] env[68906]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: d2258ded-478a-4530-b940-386286702048] Created VM on the ESX host {{(pid=68906) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 641.095948] env[68906]: DEBUG oslo_concurrency.lockutils [None req-77526f0d-ba0d-4648-acf7-33789433c0ca tempest-ImagesNegativeTestJSON-533604680 tempest-ImagesNegativeTestJSON-533604680-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 641.254190] env[68906]: DEBUG nova.network.neutron [req-a39e41a0-9f24-4cc0-945f-af6777119471 req-4201beb8-2596-430a-b61e-dbea076041cd service nova] [instance: e2ee8d01-b1d3-4bde-81ae-668ffeef42b0] Updated VIF entry in instance network info cache for port 9bdacd11-d4fc-4e1f-aac2-e4d8d097bac8. {{(pid=68906) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 641.254569] env[68906]: DEBUG nova.network.neutron [req-a39e41a0-9f24-4cc0-945f-af6777119471 req-4201beb8-2596-430a-b61e-dbea076041cd service nova] [instance: e2ee8d01-b1d3-4bde-81ae-668ffeef42b0] Updating instance_info_cache with network_info: [{"id": "9bdacd11-d4fc-4e1f-aac2-e4d8d097bac8", "address": "fa:16:3e:9a:6f:46", "network": {"id": "63efabfb-0028-4758-9626-5f9860440121", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.206", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "1ae7bf3a375d41c6af5e7536af51ffd1", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "69054a13-b7ef-44e1-bd3b-3ca5ba602848", "external-id": "nsx-vlan-transportzone-153", "segmentation_id": 153, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap9bdacd11-d4", "ovs_interfaceid": "9bdacd11-d4fc-4e1f-aac2-e4d8d097bac8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68906) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 641.264701] env[68906]: DEBUG oslo_concurrency.lockutils [req-a39e41a0-9f24-4cc0-945f-af6777119471 req-4201beb8-2596-430a-b61e-dbea076041cd service nova] Releasing lock "refresh_cache-e2ee8d01-b1d3-4bde-81ae-668ffeef42b0" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 641.295989] env[68906]: DEBUG nova.network.neutron [None req-3d9af540-455f-4eac-be79-8ed7933ffa43 tempest-MigrationsAdminTest-890819033 tempest-MigrationsAdminTest-890819033-project-member] [instance: da0c4340-a657-43bd-9a98-4c8f50add720] Updating instance_info_cache with network_info: [{"id": "eb5a0594-e1fd-4786-8944-e71fc85436cb", "address": "fa:16:3e:48:ee:36", "network": {"id": "63efabfb-0028-4758-9626-5f9860440121", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"192.168.233.209", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "1ae7bf3a375d41c6af5e7536af51ffd1", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "69054a13-b7ef-44e1-bd3b-3ca5ba602848", "external-id": "nsx-vlan-transportzone-153", "segmentation_id": 153, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapeb5a0594-e1", "ovs_interfaceid": "eb5a0594-e1fd-4786-8944-e71fc85436cb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68906) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 641.316158] env[68906]: DEBUG oslo_concurrency.lockutils [None req-6e630c8f-2ec9-418e-9d9b-a4ca682ae7cd tempest-FloatingIPsAssociationTestJSON-1370684228 tempest-FloatingIPsAssociationTestJSON-1370684228-project-member] Releasing lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 641.316818] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-6e630c8f-2ec9-418e-9d9b-a4ca682ae7cd tempest-FloatingIPsAssociationTestJSON-1370684228 tempest-FloatingIPsAssociationTestJSON-1370684228-project-member] [instance: 46481a4e-ac53-456d-b6cb-9f3ffbccf407] Processing image b1400c31-d33b-4e13-944f-4c645e62493e {{(pid=68906) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 641.317317] env[68906]: DEBUG oslo_concurrency.lockutils [None req-6e630c8f-2ec9-418e-9d9b-a4ca682ae7cd tempest-FloatingIPsAssociationTestJSON-1370684228 tempest-FloatingIPsAssociationTestJSON-1370684228-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e/b1400c31-d33b-4e13-944f-4c645e62493e.vmdk" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 641.318134] env[68906]: DEBUG oslo_concurrency.lockutils [None req-3d9af540-455f-4eac-be79-8ed7933ffa43 tempest-MigrationsAdminTest-890819033 tempest-MigrationsAdminTest-890819033-project-member] Releasing lock "refresh_cache-da0c4340-a657-43bd-9a98-4c8f50add720" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 641.319967] env[68906]: DEBUG nova.compute.manager [None req-3d9af540-455f-4eac-be79-8ed7933ffa43 tempest-MigrationsAdminTest-890819033 tempest-MigrationsAdminTest-890819033-project-member] [instance: da0c4340-a657-43bd-9a98-4c8f50add720] Instance network_info: |[{"id": "eb5a0594-e1fd-4786-8944-e71fc85436cb", "address": "fa:16:3e:48:ee:36", "network": {"id": "63efabfb-0028-4758-9626-5f9860440121", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.209", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "1ae7bf3a375d41c6af5e7536af51ffd1", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": 
true, "nsx-logical-switch-id": "69054a13-b7ef-44e1-bd3b-3ca5ba602848", "external-id": "nsx-vlan-transportzone-153", "segmentation_id": 153, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapeb5a0594-e1", "ovs_interfaceid": "eb5a0594-e1fd-4786-8944-e71fc85436cb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=68906) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 641.319967] env[68906]: DEBUG oslo_concurrency.lockutils [None req-77526f0d-ba0d-4648-acf7-33789433c0ca tempest-ImagesNegativeTestJSON-533604680 tempest-ImagesNegativeTestJSON-533604680-project-member] Acquired lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 641.320153] env[68906]: DEBUG oslo_concurrency.lockutils [None req-77526f0d-ba0d-4648-acf7-33789433c0ca tempest-ImagesNegativeTestJSON-533604680 tempest-ImagesNegativeTestJSON-533604680-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 641.321181] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-3d9af540-455f-4eac-be79-8ed7933ffa43 tempest-MigrationsAdminTest-890819033 tempest-MigrationsAdminTest-890819033-project-member] [instance: da0c4340-a657-43bd-9a98-4c8f50add720] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:48:ee:36', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '69054a13-b7ef-44e1-bd3b-3ca5ba602848', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'eb5a0594-e1fd-4786-8944-e71fc85436cb', 'vif_model': 'vmxnet3'}] {{(pid=68906) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 641.332201] env[68906]: DEBUG nova.virt.vmwareapi.vm_util [None req-3d9af540-455f-4eac-be79-8ed7933ffa43 tempest-MigrationsAdminTest-890819033 tempest-MigrationsAdminTest-890819033-project-member] Creating folder: Project (3686ca37c45c46e1bcc96a7abe3b5fa0). Parent ref: group-v694750. {{(pid=68906) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 641.332543] env[68906]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-311f2c53-44f7-433b-9446-3df1dc9b7971 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 641.334806] env[68906]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-e6570f12-c6b8-4d82-b113-6f12e5db944a {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 641.342020] env[68906]: DEBUG oslo_vmware.api [None req-77526f0d-ba0d-4648-acf7-33789433c0ca tempest-ImagesNegativeTestJSON-533604680 tempest-ImagesNegativeTestJSON-533604680-project-member] Waiting for the task: (returnval){ [ 641.342020] env[68906]: value = "session[52a3cb0d-1212-490c-2669-91043b4da4d8]52c2bd25-d100-7140-865a-dac45543ccee" [ 641.342020] env[68906]: _type = "Task" [ 641.342020] env[68906]: } to complete. 
{{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 641.346432] env[68906]: INFO nova.virt.vmwareapi.vm_util [None req-3d9af540-455f-4eac-be79-8ed7933ffa43 tempest-MigrationsAdminTest-890819033 tempest-MigrationsAdminTest-890819033-project-member] Created folder: Project (3686ca37c45c46e1bcc96a7abe3b5fa0) in parent group-v694750. [ 641.346687] env[68906]: DEBUG nova.virt.vmwareapi.vm_util [None req-3d9af540-455f-4eac-be79-8ed7933ffa43 tempest-MigrationsAdminTest-890819033 tempest-MigrationsAdminTest-890819033-project-member] Creating folder: Instances. Parent ref: group-v694763. {{(pid=68906) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 641.346914] env[68906]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-e5c7b322-3aa2-4a27-beef-adefaa2184d0 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 641.352731] env[68906]: DEBUG oslo_vmware.api [None req-77526f0d-ba0d-4648-acf7-33789433c0ca tempest-ImagesNegativeTestJSON-533604680 tempest-ImagesNegativeTestJSON-533604680-project-member] Task: {'id': session[52a3cb0d-1212-490c-2669-91043b4da4d8]52c2bd25-d100-7140-865a-dac45543ccee, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 641.361494] env[68906]: INFO nova.virt.vmwareapi.vm_util [None req-3d9af540-455f-4eac-be79-8ed7933ffa43 tempest-MigrationsAdminTest-890819033 tempest-MigrationsAdminTest-890819033-project-member] Created folder: Instances in parent group-v694763. [ 641.361741] env[68906]: DEBUG oslo.service.loopingcall [None req-3d9af540-455f-4eac-be79-8ed7933ffa43 tempest-MigrationsAdminTest-890819033 tempest-MigrationsAdminTest-890819033-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=68906) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 641.361941] env[68906]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: da0c4340-a657-43bd-9a98-4c8f50add720] Creating VM on the ESX host {{(pid=68906) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 641.362319] env[68906]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-743ba9f0-fbb4-4165-bd89-a2e7da0d237a {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 641.383438] env[68906]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 641.383438] env[68906]: value = "task-3475266" [ 641.383438] env[68906]: _type = "Task" [ 641.383438] env[68906]: } to complete. {{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 641.393052] env[68906]: DEBUG oslo_vmware.api [-] Task: {'id': task-3475266, 'name': CreateVM_Task} progress is 0%. 
{{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 641.749445] env[68906]: DEBUG nova.network.neutron [None req-96841733-5efb-421f-b645-bab4686e4f9f tempest-ServerActionsTestJSON-1120749817 tempest-ServerActionsTestJSON-1120749817-project-member] [instance: 0540a4dc-1b86-4776-b633-f540af168a2b] Successfully updated port: f99c8fa6-99d5-43ed-b528-9d2b22675c2b {{(pid=68906) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 641.764408] env[68906]: DEBUG oslo_concurrency.lockutils [None req-96841733-5efb-421f-b645-bab4686e4f9f tempest-ServerActionsTestJSON-1120749817 tempest-ServerActionsTestJSON-1120749817-project-member] Acquiring lock "refresh_cache-0540a4dc-1b86-4776-b633-f540af168a2b" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 641.764746] env[68906]: DEBUG oslo_concurrency.lockutils [None req-96841733-5efb-421f-b645-bab4686e4f9f tempest-ServerActionsTestJSON-1120749817 tempest-ServerActionsTestJSON-1120749817-project-member] Acquired lock "refresh_cache-0540a4dc-1b86-4776-b633-f540af168a2b" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 641.764746] env[68906]: DEBUG nova.network.neutron [None req-96841733-5efb-421f-b645-bab4686e4f9f tempest-ServerActionsTestJSON-1120749817 tempest-ServerActionsTestJSON-1120749817-project-member] [instance: 0540a4dc-1b86-4776-b633-f540af168a2b] Building network info cache for instance {{(pid=68906) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 641.814208] env[68906]: DEBUG nova.network.neutron [None req-96841733-5efb-421f-b645-bab4686e4f9f tempest-ServerActionsTestJSON-1120749817 tempest-ServerActionsTestJSON-1120749817-project-member] [instance: 0540a4dc-1b86-4776-b633-f540af168a2b] Instance cache missing network info. {{(pid=68906) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 641.857124] env[68906]: DEBUG oslo_concurrency.lockutils [None req-77526f0d-ba0d-4648-acf7-33789433c0ca tempest-ImagesNegativeTestJSON-533604680 tempest-ImagesNegativeTestJSON-533604680-project-member] Releasing lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 641.857365] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-77526f0d-ba0d-4648-acf7-33789433c0ca tempest-ImagesNegativeTestJSON-533604680 tempest-ImagesNegativeTestJSON-533604680-project-member] [instance: d2258ded-478a-4530-b940-386286702048] Processing image b1400c31-d33b-4e13-944f-4c645e62493e {{(pid=68906) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 641.857604] env[68906]: DEBUG oslo_concurrency.lockutils [None req-77526f0d-ba0d-4648-acf7-33789433c0ca tempest-ImagesNegativeTestJSON-533604680 tempest-ImagesNegativeTestJSON-533604680-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e/b1400c31-d33b-4e13-944f-4c645e62493e.vmdk" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 641.896264] env[68906]: DEBUG oslo_vmware.api [-] Task: {'id': task-3475266, 'name': CreateVM_Task, 'duration_secs': 0.330788} completed successfully. 
{{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 641.896506] env[68906]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: da0c4340-a657-43bd-9a98-4c8f50add720] Created VM on the ESX host {{(pid=68906) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 641.897102] env[68906]: DEBUG oslo_concurrency.lockutils [None req-3d9af540-455f-4eac-be79-8ed7933ffa43 tempest-MigrationsAdminTest-890819033 tempest-MigrationsAdminTest-890819033-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 641.897348] env[68906]: DEBUG oslo_concurrency.lockutils [None req-3d9af540-455f-4eac-be79-8ed7933ffa43 tempest-MigrationsAdminTest-890819033 tempest-MigrationsAdminTest-890819033-project-member] Acquired lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 641.897677] env[68906]: DEBUG oslo_concurrency.lockutils [None req-3d9af540-455f-4eac-be79-8ed7933ffa43 tempest-MigrationsAdminTest-890819033 tempest-MigrationsAdminTest-890819033-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 641.898188] env[68906]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-beef6d2e-780a-4f2a-8da8-5b24c655f402 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 641.903215] env[68906]: DEBUG oslo_vmware.api [None req-3d9af540-455f-4eac-be79-8ed7933ffa43 tempest-MigrationsAdminTest-890819033 tempest-MigrationsAdminTest-890819033-project-member] Waiting for the task: (returnval){ [ 641.903215] env[68906]: value = "session[52a3cb0d-1212-490c-2669-91043b4da4d8]52d06730-cc90-1648-368b-d23a2e19424e" [ 641.903215] env[68906]: _type = "Task" [ 641.903215] env[68906]: } to complete. {{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 641.914086] env[68906]: DEBUG oslo_vmware.api [None req-3d9af540-455f-4eac-be79-8ed7933ffa43 tempest-MigrationsAdminTest-890819033 tempest-MigrationsAdminTest-890819033-project-member] Task: {'id': session[52a3cb0d-1212-490c-2669-91043b4da4d8]52d06730-cc90-1648-368b-d23a2e19424e, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 642.173604] env[68906]: DEBUG nova.network.neutron [None req-96841733-5efb-421f-b645-bab4686e4f9f tempest-ServerActionsTestJSON-1120749817 tempest-ServerActionsTestJSON-1120749817-project-member] [instance: 0540a4dc-1b86-4776-b633-f540af168a2b] Updating instance_info_cache with network_info: [{"id": "f99c8fa6-99d5-43ed-b528-9d2b22675c2b", "address": "fa:16:3e:48:85:3d", "network": {"id": "0b2fac70-58a4-4e6b-88d2-37fe26c225da", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-11552093-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "5c2305d3a9b443f29494e5e234e0f492", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "1a8c8175-1197-4f12-baac-ef6aba95f585", "external-id": "nsx-vlan-transportzone-832", "segmentation_id": 832, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapf99c8fa6-99", "ovs_interfaceid": "f99c8fa6-99d5-43ed-b528-9d2b22675c2b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68906) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 642.188991] env[68906]: DEBUG oslo_concurrency.lockutils [None req-96841733-5efb-421f-b645-bab4686e4f9f tempest-ServerActionsTestJSON-1120749817 tempest-ServerActionsTestJSON-1120749817-project-member] Releasing lock "refresh_cache-0540a4dc-1b86-4776-b633-f540af168a2b" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 642.188991] env[68906]: DEBUG nova.compute.manager [None req-96841733-5efb-421f-b645-bab4686e4f9f tempest-ServerActionsTestJSON-1120749817 tempest-ServerActionsTestJSON-1120749817-project-member] [instance: 0540a4dc-1b86-4776-b633-f540af168a2b] Instance network_info: |[{"id": "f99c8fa6-99d5-43ed-b528-9d2b22675c2b", "address": "fa:16:3e:48:85:3d", "network": {"id": "0b2fac70-58a4-4e6b-88d2-37fe26c225da", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-11552093-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "5c2305d3a9b443f29494e5e234e0f492", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "1a8c8175-1197-4f12-baac-ef6aba95f585", "external-id": "nsx-vlan-transportzone-832", "segmentation_id": 832, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapf99c8fa6-99", "ovs_interfaceid": "f99c8fa6-99d5-43ed-b528-9d2b22675c2b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=68906) _allocate_network_async 
/opt/stack/nova/nova/compute/manager.py:1971}} [ 642.189242] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-96841733-5efb-421f-b645-bab4686e4f9f tempest-ServerActionsTestJSON-1120749817 tempest-ServerActionsTestJSON-1120749817-project-member] [instance: 0540a4dc-1b86-4776-b633-f540af168a2b] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:48:85:3d', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '1a8c8175-1197-4f12-baac-ef6aba95f585', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'f99c8fa6-99d5-43ed-b528-9d2b22675c2b', 'vif_model': 'vmxnet3'}] {{(pid=68906) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 642.200683] env[68906]: DEBUG nova.virt.vmwareapi.vm_util [None req-96841733-5efb-421f-b645-bab4686e4f9f tempest-ServerActionsTestJSON-1120749817 tempest-ServerActionsTestJSON-1120749817-project-member] Creating folder: Project (5c2305d3a9b443f29494e5e234e0f492). Parent ref: group-v694750. {{(pid=68906) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 642.200905] env[68906]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-5288468d-6c0c-496a-9068-694309c7b862 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 642.213568] env[68906]: INFO nova.virt.vmwareapi.vm_util [None req-96841733-5efb-421f-b645-bab4686e4f9f tempest-ServerActionsTestJSON-1120749817 tempest-ServerActionsTestJSON-1120749817-project-member] Created folder: Project (5c2305d3a9b443f29494e5e234e0f492) in parent group-v694750. [ 642.213768] env[68906]: DEBUG nova.virt.vmwareapi.vm_util [None req-96841733-5efb-421f-b645-bab4686e4f9f tempest-ServerActionsTestJSON-1120749817 tempest-ServerActionsTestJSON-1120749817-project-member] Creating folder: Instances. Parent ref: group-v694766. {{(pid=68906) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 642.214263] env[68906]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-68771133-78f8-4c8e-9110-68f47b2eabe1 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 642.227873] env[68906]: INFO nova.virt.vmwareapi.vm_util [None req-96841733-5efb-421f-b645-bab4686e4f9f tempest-ServerActionsTestJSON-1120749817 tempest-ServerActionsTestJSON-1120749817-project-member] Created folder: Instances in parent group-v694766. [ 642.228097] env[68906]: DEBUG oslo.service.loopingcall [None req-96841733-5efb-421f-b645-bab4686e4f9f tempest-ServerActionsTestJSON-1120749817 tempest-ServerActionsTestJSON-1120749817-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=68906) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 642.228302] env[68906]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 0540a4dc-1b86-4776-b633-f540af168a2b] Creating VM on the ESX host {{(pid=68906) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 642.228633] env[68906]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-73115a76-db68-451a-a4c6-16dd2624a7f9 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 642.254062] env[68906]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 642.254062] env[68906]: value = "task-3475269" [ 642.254062] env[68906]: _type = "Task" [ 642.254062] env[68906]: } to complete. 
{{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 642.267063] env[68906]: DEBUG oslo_vmware.api [-] Task: {'id': task-3475269, 'name': CreateVM_Task} progress is 0%. {{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 642.415087] env[68906]: DEBUG oslo_concurrency.lockutils [None req-3d9af540-455f-4eac-be79-8ed7933ffa43 tempest-MigrationsAdminTest-890819033 tempest-MigrationsAdminTest-890819033-project-member] Releasing lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 642.415434] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-3d9af540-455f-4eac-be79-8ed7933ffa43 tempest-MigrationsAdminTest-890819033 tempest-MigrationsAdminTest-890819033-project-member] [instance: da0c4340-a657-43bd-9a98-4c8f50add720] Processing image b1400c31-d33b-4e13-944f-4c645e62493e {{(pid=68906) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 642.416095] env[68906]: DEBUG oslo_concurrency.lockutils [None req-3d9af540-455f-4eac-be79-8ed7933ffa43 tempest-MigrationsAdminTest-890819033 tempest-MigrationsAdminTest-890819033-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e/b1400c31-d33b-4e13-944f-4c645e62493e.vmdk" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 642.687893] env[68906]: DEBUG oslo_concurrency.lockutils [None req-a8230d4c-efd4-4641-85fe-bd949bd747e5 tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Acquiring lock "4edb8b9f-b608-4be8-bfd3-65642710f9bd" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 642.688194] env[68906]: DEBUG oslo_concurrency.lockutils [None req-a8230d4c-efd4-4641-85fe-bd949bd747e5 tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Lock "4edb8b9f-b608-4be8-bfd3-65642710f9bd" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 642.694291] env[68906]: DEBUG oslo_concurrency.lockutils [None req-579f131a-2dbc-470d-9058-ec3a343d421d tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Acquiring lock "d6ca51b9-b284-405c-878e-fdbc326b73e1" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 642.694742] env[68906]: DEBUG oslo_concurrency.lockutils [None req-579f131a-2dbc-470d-9058-ec3a343d421d tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Lock "d6ca51b9-b284-405c-878e-fdbc326b73e1" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 642.703451] env[68906]: DEBUG nova.compute.manager [None req-a8230d4c-efd4-4641-85fe-bd949bd747e5 tempest-AttachInterfacesTestJSON-916650638 
tempest-AttachInterfacesTestJSON-916650638-project-member] [instance: 4edb8b9f-b608-4be8-bfd3-65642710f9bd] Starting instance... {{(pid=68906) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 642.712821] env[68906]: DEBUG nova.compute.manager [None req-579f131a-2dbc-470d-9058-ec3a343d421d tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: d6ca51b9-b284-405c-878e-fdbc326b73e1] Starting instance... {{(pid=68906) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 642.764894] env[68906]: DEBUG oslo_vmware.api [-] Task: {'id': task-3475269, 'name': CreateVM_Task, 'duration_secs': 0.311448} completed successfully. {{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 642.766390] env[68906]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 0540a4dc-1b86-4776-b633-f540af168a2b] Created VM on the ESX host {{(pid=68906) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 642.775650] env[68906]: DEBUG oslo_concurrency.lockutils [None req-96841733-5efb-421f-b645-bab4686e4f9f tempest-ServerActionsTestJSON-1120749817 tempest-ServerActionsTestJSON-1120749817-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 642.775650] env[68906]: DEBUG oslo_concurrency.lockutils [None req-96841733-5efb-421f-b645-bab4686e4f9f tempest-ServerActionsTestJSON-1120749817 tempest-ServerActionsTestJSON-1120749817-project-member] Acquired lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 642.775858] env[68906]: DEBUG oslo_concurrency.lockutils [None req-96841733-5efb-421f-b645-bab4686e4f9f tempest-ServerActionsTestJSON-1120749817 tempest-ServerActionsTestJSON-1120749817-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 642.777137] env[68906]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-4ee03833-353a-42f3-bf61-c1b182e37e87 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 642.782995] env[68906]: DEBUG oslo_vmware.api [None req-96841733-5efb-421f-b645-bab4686e4f9f tempest-ServerActionsTestJSON-1120749817 tempest-ServerActionsTestJSON-1120749817-project-member] Waiting for the task: (returnval){ [ 642.782995] env[68906]: value = "session[52a3cb0d-1212-490c-2669-91043b4da4d8]52598124-5467-ed28-7044-eb765a4541cf" [ 642.782995] env[68906]: _type = "Task" [ 642.782995] env[68906]: } to complete. {{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 642.792326] env[68906]: DEBUG oslo_vmware.api [None req-96841733-5efb-421f-b645-bab4686e4f9f tempest-ServerActionsTestJSON-1120749817 tempest-ServerActionsTestJSON-1120749817-project-member] Task: {'id': session[52a3cb0d-1212-490c-2669-91043b4da4d8]52598124-5467-ed28-7044-eb765a4541cf, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 642.796729] env[68906]: DEBUG oslo_concurrency.lockutils [None req-a8230d4c-efd4-4641-85fe-bd949bd747e5 tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 642.796965] env[68906]: DEBUG oslo_concurrency.lockutils [None req-a8230d4c-efd4-4641-85fe-bd949bd747e5 tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 642.798597] env[68906]: INFO nova.compute.claims [None req-a8230d4c-efd4-4641-85fe-bd949bd747e5 tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] [instance: 4edb8b9f-b608-4be8-bfd3-65642710f9bd] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 642.819673] env[68906]: DEBUG oslo_concurrency.lockutils [None req-579f131a-2dbc-470d-9058-ec3a343d421d tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 643.043109] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c8ee7e60-b0b8-42bb-b384-5690d1378719 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 643.052116] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-500f8f5c-a59a-4ccf-9c36-a724b3fa852d {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 643.091421] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2e686252-8aeb-43a5-90e0-c4956c7b5cac {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 643.099767] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-27845a1b-cd20-458c-b353-0440ee3fc8cc {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 643.115333] env[68906]: DEBUG nova.compute.provider_tree [None req-a8230d4c-efd4-4641-85fe-bd949bd747e5 tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Inventory has not changed in ProviderTree for provider: 1119f6db-bfd7-4ef3-bdff-5c6974dc249b {{(pid=68906) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 643.133586] env[68906]: DEBUG nova.scheduler.client.report [None req-a8230d4c-efd4-4641-85fe-bd949bd747e5 tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Inventory has not changed for provider 1119f6db-bfd7-4ef3-bdff-5c6974dc249b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 
4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 93, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68906) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 643.170103] env[68906]: DEBUG oslo_concurrency.lockutils [None req-a8230d4c-efd4-4641-85fe-bd949bd747e5 tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.373s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 643.170723] env[68906]: DEBUG nova.compute.manager [None req-a8230d4c-efd4-4641-85fe-bd949bd747e5 tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] [instance: 4edb8b9f-b608-4be8-bfd3-65642710f9bd] Start building networks asynchronously for instance. {{(pid=68906) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 643.174512] env[68906]: DEBUG oslo_concurrency.lockutils [None req-579f131a-2dbc-470d-9058-ec3a343d421d tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.356s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 643.176266] env[68906]: INFO nova.compute.claims [None req-579f131a-2dbc-470d-9058-ec3a343d421d tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: d6ca51b9-b284-405c-878e-fdbc326b73e1] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 643.237631] env[68906]: DEBUG nova.compute.utils [None req-a8230d4c-efd4-4641-85fe-bd949bd747e5 tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Using /dev/sd instead of None {{(pid=68906) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 643.242804] env[68906]: DEBUG nova.compute.manager [None req-a8230d4c-efd4-4641-85fe-bd949bd747e5 tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] [instance: 4edb8b9f-b608-4be8-bfd3-65642710f9bd] Allocating IP information in the background. {{(pid=68906) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 643.242804] env[68906]: DEBUG nova.network.neutron [None req-a8230d4c-efd4-4641-85fe-bd949bd747e5 tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] [instance: 4edb8b9f-b608-4be8-bfd3-65642710f9bd] allocate_for_instance() {{(pid=68906) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 643.251218] env[68906]: DEBUG nova.compute.manager [None req-a8230d4c-efd4-4641-85fe-bd949bd747e5 tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] [instance: 4edb8b9f-b608-4be8-bfd3-65642710f9bd] Start building block device mappings for instance. 
{{(pid=68906) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 643.296677] env[68906]: DEBUG oslo_concurrency.lockutils [None req-96841733-5efb-421f-b645-bab4686e4f9f tempest-ServerActionsTestJSON-1120749817 tempest-ServerActionsTestJSON-1120749817-project-member] Releasing lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 643.297025] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-96841733-5efb-421f-b645-bab4686e4f9f tempest-ServerActionsTestJSON-1120749817 tempest-ServerActionsTestJSON-1120749817-project-member] [instance: 0540a4dc-1b86-4776-b633-f540af168a2b] Processing image b1400c31-d33b-4e13-944f-4c645e62493e {{(pid=68906) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 643.297163] env[68906]: DEBUG oslo_concurrency.lockutils [None req-96841733-5efb-421f-b645-bab4686e4f9f tempest-ServerActionsTestJSON-1120749817 tempest-ServerActionsTestJSON-1120749817-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e/b1400c31-d33b-4e13-944f-4c645e62493e.vmdk" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 643.345362] env[68906]: DEBUG nova.compute.manager [None req-a8230d4c-efd4-4641-85fe-bd949bd747e5 tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] [instance: 4edb8b9f-b608-4be8-bfd3-65642710f9bd] Start spawning the instance on the hypervisor. {{(pid=68906) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 643.384773] env[68906]: DEBUG nova.virt.hardware [None req-a8230d4c-efd4-4641-85fe-bd949bd747e5 tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-17T13:00:39Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-17T13:00:23Z,direct_url=,disk_format='vmdk',id=b1400c31-d33b-4e13-944f-4c645e62493e,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='1ae7bf3a375d41c6af5e7536af51ffd1',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-17T13:00:24Z,virtual_size=,visibility=), allow threads: False {{(pid=68906) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 643.386331] env[68906]: DEBUG nova.virt.hardware [None req-a8230d4c-efd4-4641-85fe-bd949bd747e5 tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Flavor limits 0:0:0 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 643.386331] env[68906]: DEBUG nova.virt.hardware [None req-a8230d4c-efd4-4641-85fe-bd949bd747e5 tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Image limits 0:0:0 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 643.386835] env[68906]: DEBUG nova.virt.hardware [None req-a8230d4c-efd4-4641-85fe-bd949bd747e5 tempest-AttachInterfacesTestJSON-916650638 
tempest-AttachInterfacesTestJSON-916650638-project-member] Flavor pref 0:0:0 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 643.386835] env[68906]: DEBUG nova.virt.hardware [None req-a8230d4c-efd4-4641-85fe-bd949bd747e5 tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Image pref 0:0:0 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 643.386835] env[68906]: DEBUG nova.virt.hardware [None req-a8230d4c-efd4-4641-85fe-bd949bd747e5 tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 643.387283] env[68906]: DEBUG nova.virt.hardware [None req-a8230d4c-efd4-4641-85fe-bd949bd747e5 tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68906) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 643.387283] env[68906]: DEBUG nova.virt.hardware [None req-a8230d4c-efd4-4641-85fe-bd949bd747e5 tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68906) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 643.387419] env[68906]: DEBUG nova.virt.hardware [None req-a8230d4c-efd4-4641-85fe-bd949bd747e5 tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Got 1 possible topologies {{(pid=68906) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 643.387527] env[68906]: DEBUG nova.virt.hardware [None req-a8230d4c-efd4-4641-85fe-bd949bd747e5 tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68906) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 643.387761] env[68906]: DEBUG nova.virt.hardware [None req-a8230d4c-efd4-4641-85fe-bd949bd747e5 tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68906) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 643.389034] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5c6de934-4760-4353-96fc-ffd10e12f32a {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 643.395312] env[68906]: DEBUG nova.policy [None req-a8230d4c-efd4-4641-85fe-bd949bd747e5 tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'c08a6c439ba94d18b742a133848aaaae', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '0e206dedfb584e219a7f5dd633032515', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 
'service_roles': []} {{(pid=68906) authorize /opt/stack/nova/nova/policy.py:203}} [ 643.404018] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a03ba3bf-8c05-4b4c-abd8-0508a72d83c6 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 643.438514] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1c30b815-0508-4bf9-ab5c-018cb57a899c {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 643.445751] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e8ef10fd-b16a-4f12-aa57-a4609eda667a {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 643.484947] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8d856e97-f11b-4287-9545-d4b102a89269 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 643.493492] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f43c7e20-4864-47de-84bd-a062659f3581 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 643.509717] env[68906]: DEBUG nova.compute.provider_tree [None req-579f131a-2dbc-470d-9058-ec3a343d421d tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Inventory has not changed in ProviderTree for provider: 1119f6db-bfd7-4ef3-bdff-5c6974dc249b {{(pid=68906) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 643.523591] env[68906]: DEBUG nova.scheduler.client.report [None req-579f131a-2dbc-470d-9058-ec3a343d421d tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Inventory has not changed for provider 1119f6db-bfd7-4ef3-bdff-5c6974dc249b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 93, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68906) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 643.542166] env[68906]: DEBUG oslo_concurrency.lockutils [None req-579f131a-2dbc-470d-9058-ec3a343d421d tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.367s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 643.542166] env[68906]: DEBUG nova.compute.manager [None req-579f131a-2dbc-470d-9058-ec3a343d421d tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: d6ca51b9-b284-405c-878e-fdbc326b73e1] Start building networks asynchronously for instance. 
{{(pid=68906) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 643.595620] env[68906]: DEBUG nova.compute.utils [None req-579f131a-2dbc-470d-9058-ec3a343d421d tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Using /dev/sd instead of None {{(pid=68906) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 643.597059] env[68906]: DEBUG nova.compute.manager [None req-579f131a-2dbc-470d-9058-ec3a343d421d tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: d6ca51b9-b284-405c-878e-fdbc326b73e1] Allocating IP information in the background. {{(pid=68906) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 643.597185] env[68906]: DEBUG nova.network.neutron [None req-579f131a-2dbc-470d-9058-ec3a343d421d tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: d6ca51b9-b284-405c-878e-fdbc326b73e1] allocate_for_instance() {{(pid=68906) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 643.617432] env[68906]: DEBUG nova.compute.manager [None req-579f131a-2dbc-470d-9058-ec3a343d421d tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: d6ca51b9-b284-405c-878e-fdbc326b73e1] Start building block device mappings for instance. {{(pid=68906) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 643.705231] env[68906]: DEBUG nova.compute.manager [None req-579f131a-2dbc-470d-9058-ec3a343d421d tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: d6ca51b9-b284-405c-878e-fdbc326b73e1] Start spawning the instance on the hypervisor. {{(pid=68906) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 643.742582] env[68906]: DEBUG nova.virt.hardware [None req-579f131a-2dbc-470d-9058-ec3a343d421d tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-17T13:00:39Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-17T13:00:23Z,direct_url=,disk_format='vmdk',id=b1400c31-d33b-4e13-944f-4c645e62493e,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='1ae7bf3a375d41c6af5e7536af51ffd1',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-17T13:00:24Z,virtual_size=,visibility=), allow threads: False {{(pid=68906) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 643.742821] env[68906]: DEBUG nova.virt.hardware [None req-579f131a-2dbc-470d-9058-ec3a343d421d tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Flavor limits 0:0:0 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 643.743518] env[68906]: DEBUG nova.virt.hardware [None req-579f131a-2dbc-470d-9058-ec3a343d421d tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Image limits 0:0:0 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 643.743518] env[68906]: DEBUG nova.virt.hardware [None 
req-579f131a-2dbc-470d-9058-ec3a343d421d tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Flavor pref 0:0:0 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 643.744229] env[68906]: DEBUG nova.virt.hardware [None req-579f131a-2dbc-470d-9058-ec3a343d421d tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Image pref 0:0:0 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 643.744229] env[68906]: DEBUG nova.virt.hardware [None req-579f131a-2dbc-470d-9058-ec3a343d421d tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 643.744229] env[68906]: DEBUG nova.virt.hardware [None req-579f131a-2dbc-470d-9058-ec3a343d421d tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68906) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 643.744379] env[68906]: DEBUG nova.virt.hardware [None req-579f131a-2dbc-470d-9058-ec3a343d421d tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68906) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 643.744802] env[68906]: DEBUG nova.virt.hardware [None req-579f131a-2dbc-470d-9058-ec3a343d421d tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Got 1 possible topologies {{(pid=68906) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 643.744802] env[68906]: DEBUG nova.virt.hardware [None req-579f131a-2dbc-470d-9058-ec3a343d421d tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68906) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 643.744909] env[68906]: DEBUG nova.virt.hardware [None req-579f131a-2dbc-470d-9058-ec3a343d421d tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68906) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 643.746056] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a01a2401-9a93-4eb2-ae53-1441cba92f36 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 643.750548] env[68906]: DEBUG nova.policy [None req-579f131a-2dbc-470d-9058-ec3a343d421d tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'e208107293fd4f82af1f396d43464b69', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '90f212f7916446919081fcdc0527ebb0', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68906) authorize 
/opt/stack/nova/nova/policy.py:203}} [ 643.758623] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-de0ff7bc-4a54-4053-9764-27136241cbc7 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 643.803703] env[68906]: DEBUG nova.network.neutron [None req-a8230d4c-efd4-4641-85fe-bd949bd747e5 tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] [instance: 4edb8b9f-b608-4be8-bfd3-65642710f9bd] Successfully created port: 57b48c3e-57a8-4ee0-a974-0813b3871e35 {{(pid=68906) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 643.839701] env[68906]: DEBUG nova.compute.manager [req-65ffb53b-396c-44ac-b193-d580ff975102 req-0f425d44-ee40-480d-b8b5-3d54430e692c service nova] [instance: 46481a4e-ac53-456d-b6cb-9f3ffbccf407] Received event network-vif-plugged-25488db1-53a4-49f8-9b01-2243a420bdbb {{(pid=68906) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 643.839843] env[68906]: DEBUG oslo_concurrency.lockutils [req-65ffb53b-396c-44ac-b193-d580ff975102 req-0f425d44-ee40-480d-b8b5-3d54430e692c service nova] Acquiring lock "46481a4e-ac53-456d-b6cb-9f3ffbccf407-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 643.840201] env[68906]: DEBUG oslo_concurrency.lockutils [req-65ffb53b-396c-44ac-b193-d580ff975102 req-0f425d44-ee40-480d-b8b5-3d54430e692c service nova] Lock "46481a4e-ac53-456d-b6cb-9f3ffbccf407-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 643.840960] env[68906]: DEBUG oslo_concurrency.lockutils [req-65ffb53b-396c-44ac-b193-d580ff975102 req-0f425d44-ee40-480d-b8b5-3d54430e692c service nova] Lock "46481a4e-ac53-456d-b6cb-9f3ffbccf407-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 643.840960] env[68906]: DEBUG nova.compute.manager [req-65ffb53b-396c-44ac-b193-d580ff975102 req-0f425d44-ee40-480d-b8b5-3d54430e692c service nova] [instance: 46481a4e-ac53-456d-b6cb-9f3ffbccf407] No waiting events found dispatching network-vif-plugged-25488db1-53a4-49f8-9b01-2243a420bdbb {{(pid=68906) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 643.841883] env[68906]: WARNING nova.compute.manager [req-65ffb53b-396c-44ac-b193-d580ff975102 req-0f425d44-ee40-480d-b8b5-3d54430e692c service nova] [instance: 46481a4e-ac53-456d-b6cb-9f3ffbccf407] Received unexpected event network-vif-plugged-25488db1-53a4-49f8-9b01-2243a420bdbb for instance with vm_state building and task_state spawning. 
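The lock/dispatch/WARNING sequence above is Nova's external-event bookkeeping: each network-vif-plugged notification from Neutron is matched, under a per-instance "<uuid>-events" lock, against any waiter previously registered for that event; here no waiter exists yet (the instances are still building), so the pop finds nothing and the event is dropped with the "Received unexpected event" warning. Below is a minimal sketch of that pop-under-lock pattern, using plain threading in place of Nova's eventlet/oslo.concurrency primitives; the class and method names mirror the log, but the bodies are simplified illustrations, not Nova's actual implementation.

```python
import threading
from collections import defaultdict

class InstanceEvents:
    """Simplified stand-in for the InstanceEvents bookkeeping seen in the log."""

    def __init__(self):
        self._lock = threading.Lock()      # plays the role of the "<uuid>-events" lock
        self._events = defaultdict(dict)   # instance_uuid -> {event_name: threading.Event}

    def prepare_for_instance_event(self, instance_uuid, event_name):
        # Registered by whichever code path will later block on the event.
        with self._lock:
            ev = threading.Event()
            self._events[instance_uuid][event_name] = ev
        return ev

    def pop_instance_event(self, instance_uuid, event_name):
        # Mirrors the Acquiring/acquired/released entries around _pop_event
        # above: remove the waiter atomically, then decide what to do outside.
        with self._lock:
            return self._events.get(instance_uuid, {}).pop(event_name, None)

def external_instance_event(events, instance_uuid, event_name):
    waiter = events.pop_instance_event(instance_uuid, event_name)
    if waiter is None:
        # Corresponds to "No waiting events found dispatching ..." followed
        # by the WARNING about an unexpected event while spawning.
        print(f"unexpected event {event_name} for instance {instance_uuid}")
    else:
        waiter.set()   # wake the thread blocked on the vif-plug event
```

In the real service the waiter is typically the spawn path blocking until Neutron reports the VIF as plugged; a bare threading.Event stands in for that here.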
[ 643.842187] env[68906]: DEBUG nova.compute.manager [req-65ffb53b-396c-44ac-b193-d580ff975102 req-0f425d44-ee40-480d-b8b5-3d54430e692c service nova] [instance: d2258ded-478a-4530-b940-386286702048] Received event network-vif-plugged-aa68d5f0-3302-4547-984f-1d67a625a3ca {{(pid=68906) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 643.842522] env[68906]: DEBUG oslo_concurrency.lockutils [req-65ffb53b-396c-44ac-b193-d580ff975102 req-0f425d44-ee40-480d-b8b5-3d54430e692c service nova] Acquiring lock "d2258ded-478a-4530-b940-386286702048-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 643.842823] env[68906]: DEBUG oslo_concurrency.lockutils [req-65ffb53b-396c-44ac-b193-d580ff975102 req-0f425d44-ee40-480d-b8b5-3d54430e692c service nova] Lock "d2258ded-478a-4530-b940-386286702048-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 643.843054] env[68906]: DEBUG oslo_concurrency.lockutils [req-65ffb53b-396c-44ac-b193-d580ff975102 req-0f425d44-ee40-480d-b8b5-3d54430e692c service nova] Lock "d2258ded-478a-4530-b940-386286702048-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 643.843281] env[68906]: DEBUG nova.compute.manager [req-65ffb53b-396c-44ac-b193-d580ff975102 req-0f425d44-ee40-480d-b8b5-3d54430e692c service nova] [instance: d2258ded-478a-4530-b940-386286702048] No waiting events found dispatching network-vif-plugged-aa68d5f0-3302-4547-984f-1d67a625a3ca {{(pid=68906) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 643.843497] env[68906]: WARNING nova.compute.manager [req-65ffb53b-396c-44ac-b193-d580ff975102 req-0f425d44-ee40-480d-b8b5-3d54430e692c service nova] [instance: d2258ded-478a-4530-b940-386286702048] Received unexpected event network-vif-plugged-aa68d5f0-3302-4547-984f-1d67a625a3ca for instance with vm_state building and task_state spawning. [ 643.843787] env[68906]: DEBUG nova.compute.manager [req-65ffb53b-396c-44ac-b193-d580ff975102 req-0f425d44-ee40-480d-b8b5-3d54430e692c service nova] [instance: 46481a4e-ac53-456d-b6cb-9f3ffbccf407] Received event network-changed-25488db1-53a4-49f8-9b01-2243a420bdbb {{(pid=68906) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 643.844066] env[68906]: DEBUG nova.compute.manager [req-65ffb53b-396c-44ac-b193-d580ff975102 req-0f425d44-ee40-480d-b8b5-3d54430e692c service nova] [instance: 46481a4e-ac53-456d-b6cb-9f3ffbccf407] Refreshing instance network info cache due to event network-changed-25488db1-53a4-49f8-9b01-2243a420bdbb. 
{{(pid=68906) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 643.844773] env[68906]: DEBUG oslo_concurrency.lockutils [req-65ffb53b-396c-44ac-b193-d580ff975102 req-0f425d44-ee40-480d-b8b5-3d54430e692c service nova] Acquiring lock "refresh_cache-46481a4e-ac53-456d-b6cb-9f3ffbccf407" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 643.844773] env[68906]: DEBUG oslo_concurrency.lockutils [req-65ffb53b-396c-44ac-b193-d580ff975102 req-0f425d44-ee40-480d-b8b5-3d54430e692c service nova] Acquired lock "refresh_cache-46481a4e-ac53-456d-b6cb-9f3ffbccf407" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 643.844773] env[68906]: DEBUG nova.network.neutron [req-65ffb53b-396c-44ac-b193-d580ff975102 req-0f425d44-ee40-480d-b8b5-3d54430e692c service nova] [instance: 46481a4e-ac53-456d-b6cb-9f3ffbccf407] Refreshing network info cache for port 25488db1-53a4-49f8-9b01-2243a420bdbb {{(pid=68906) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 644.163400] env[68906]: DEBUG oslo_concurrency.lockutils [None req-dac96d6f-6d7b-49e4-904b-bb16242f78bb tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] Acquiring lock "f42056e5-52cb-4d69-8022-ca643c49194e" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 644.163651] env[68906]: DEBUG oslo_concurrency.lockutils [None req-dac96d6f-6d7b-49e4-904b-bb16242f78bb tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] Lock "f42056e5-52cb-4d69-8022-ca643c49194e" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 644.181772] env[68906]: DEBUG nova.compute.manager [None req-dac96d6f-6d7b-49e4-904b-bb16242f78bb tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] [instance: f42056e5-52cb-4d69-8022-ca643c49194e] Starting instance... 
{{(pid=68906) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 644.265103] env[68906]: DEBUG oslo_concurrency.lockutils [None req-dac96d6f-6d7b-49e4-904b-bb16242f78bb tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 644.265103] env[68906]: DEBUG oslo_concurrency.lockutils [None req-dac96d6f-6d7b-49e4-904b-bb16242f78bb tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 644.265951] env[68906]: INFO nova.compute.claims [None req-dac96d6f-6d7b-49e4-904b-bb16242f78bb tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] [instance: f42056e5-52cb-4d69-8022-ca643c49194e] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 644.349071] env[68906]: DEBUG nova.network.neutron [None req-579f131a-2dbc-470d-9058-ec3a343d421d tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: d6ca51b9-b284-405c-878e-fdbc326b73e1] Successfully created port: 390f6aeb-46ca-4723-b36d-949492fa4618 {{(pid=68906) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 644.357438] env[68906]: DEBUG nova.compute.manager [req-c355da5d-c17b-418a-8e70-a7e291292473 req-bd8010df-9138-431f-a478-0d5f0092bbf6 service nova] [instance: 0540a4dc-1b86-4776-b633-f540af168a2b] Received event network-vif-plugged-f99c8fa6-99d5-43ed-b528-9d2b22675c2b {{(pid=68906) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 644.357642] env[68906]: DEBUG oslo_concurrency.lockutils [req-c355da5d-c17b-418a-8e70-a7e291292473 req-bd8010df-9138-431f-a478-0d5f0092bbf6 service nova] Acquiring lock "0540a4dc-1b86-4776-b633-f540af168a2b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 644.357841] env[68906]: DEBUG oslo_concurrency.lockutils [req-c355da5d-c17b-418a-8e70-a7e291292473 req-bd8010df-9138-431f-a478-0d5f0092bbf6 service nova] Lock "0540a4dc-1b86-4776-b633-f540af168a2b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 644.358046] env[68906]: DEBUG oslo_concurrency.lockutils [req-c355da5d-c17b-418a-8e70-a7e291292473 req-bd8010df-9138-431f-a478-0d5f0092bbf6 service nova] Lock "0540a4dc-1b86-4776-b633-f540af168a2b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 644.359015] env[68906]: DEBUG nova.compute.manager [req-c355da5d-c17b-418a-8e70-a7e291292473 req-bd8010df-9138-431f-a478-0d5f0092bbf6 service nova] [instance: 0540a4dc-1b86-4776-b633-f540af168a2b] No waiting events found dispatching network-vif-plugged-f99c8fa6-99d5-43ed-b528-9d2b22675c2b {{(pid=68906) pop_instance_event 
/opt/stack/nova/nova/compute/manager.py:320}} [ 644.359220] env[68906]: WARNING nova.compute.manager [req-c355da5d-c17b-418a-8e70-a7e291292473 req-bd8010df-9138-431f-a478-0d5f0092bbf6 service nova] [instance: 0540a4dc-1b86-4776-b633-f540af168a2b] Received unexpected event network-vif-plugged-f99c8fa6-99d5-43ed-b528-9d2b22675c2b for instance with vm_state building and task_state spawning. [ 644.427587] env[68906]: DEBUG nova.network.neutron [req-65ffb53b-396c-44ac-b193-d580ff975102 req-0f425d44-ee40-480d-b8b5-3d54430e692c service nova] [instance: 46481a4e-ac53-456d-b6cb-9f3ffbccf407] Updated VIF entry in instance network info cache for port 25488db1-53a4-49f8-9b01-2243a420bdbb. {{(pid=68906) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 644.427945] env[68906]: DEBUG nova.network.neutron [req-65ffb53b-396c-44ac-b193-d580ff975102 req-0f425d44-ee40-480d-b8b5-3d54430e692c service nova] [instance: 46481a4e-ac53-456d-b6cb-9f3ffbccf407] Updating instance_info_cache with network_info: [{"id": "25488db1-53a4-49f8-9b01-2243a420bdbb", "address": "fa:16:3e:a9:b2:61", "network": {"id": "660d72f0-d6d8-4de8-bfee-2acd35981f86", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-571430305-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "43130cf9c64b405a9e5e0094228c72de", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "a64108f9-df0a-4feb-bbb5-97f5841c356c", "external-id": "nsx-vlan-transportzone-67", "segmentation_id": 67, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap25488db1-53", "ovs_interfaceid": "25488db1-53a4-49f8-9b01-2243a420bdbb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68906) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 644.440076] env[68906]: DEBUG oslo_concurrency.lockutils [req-65ffb53b-396c-44ac-b193-d580ff975102 req-0f425d44-ee40-480d-b8b5-3d54430e692c service nova] Releasing lock "refresh_cache-46481a4e-ac53-456d-b6cb-9f3ffbccf407" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 644.440332] env[68906]: DEBUG nova.compute.manager [req-65ffb53b-396c-44ac-b193-d580ff975102 req-0f425d44-ee40-480d-b8b5-3d54430e692c service nova] [instance: d2258ded-478a-4530-b940-386286702048] Received event network-changed-aa68d5f0-3302-4547-984f-1d67a625a3ca {{(pid=68906) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 644.440522] env[68906]: DEBUG nova.compute.manager [req-65ffb53b-396c-44ac-b193-d580ff975102 req-0f425d44-ee40-480d-b8b5-3d54430e692c service nova] [instance: d2258ded-478a-4530-b940-386286702048] Refreshing instance network info cache due to event network-changed-aa68d5f0-3302-4547-984f-1d67a625a3ca. 
{{(pid=68906) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 644.440726] env[68906]: DEBUG oslo_concurrency.lockutils [req-65ffb53b-396c-44ac-b193-d580ff975102 req-0f425d44-ee40-480d-b8b5-3d54430e692c service nova] Acquiring lock "refresh_cache-d2258ded-478a-4530-b940-386286702048" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 644.440862] env[68906]: DEBUG oslo_concurrency.lockutils [req-65ffb53b-396c-44ac-b193-d580ff975102 req-0f425d44-ee40-480d-b8b5-3d54430e692c service nova] Acquired lock "refresh_cache-d2258ded-478a-4530-b940-386286702048" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 644.441039] env[68906]: DEBUG nova.network.neutron [req-65ffb53b-396c-44ac-b193-d580ff975102 req-0f425d44-ee40-480d-b8b5-3d54430e692c service nova] [instance: d2258ded-478a-4530-b940-386286702048] Refreshing network info cache for port aa68d5f0-3302-4547-984f-1d67a625a3ca {{(pid=68906) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 644.527467] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c2218410-b448-4a0b-93b8-b26d5696c134 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 644.536205] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-199e39a6-e214-4dc7-949d-b5291780771e {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 644.576016] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e7d67cb2-fde0-45c7-b6c3-7799dbb00a14 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 644.585491] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-47e81d3c-56b9-4d08-a694-fafab5460480 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 644.602956] env[68906]: DEBUG nova.compute.provider_tree [None req-dac96d6f-6d7b-49e4-904b-bb16242f78bb tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] Inventory has not changed in ProviderTree for provider: 1119f6db-bfd7-4ef3-bdff-5c6974dc249b {{(pid=68906) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 644.618021] env[68906]: DEBUG nova.scheduler.client.report [None req-dac96d6f-6d7b-49e4-904b-bb16242f78bb tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] Inventory has not changed for provider 1119f6db-bfd7-4ef3-bdff-5c6974dc249b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 93, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68906) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 644.632563] env[68906]: DEBUG oslo_concurrency.lockutils [None req-dac96d6f-6d7b-49e4-904b-bb16242f78bb tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] Lock "compute_resources" "released" 
by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.368s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 644.635867] env[68906]: DEBUG nova.compute.manager [None req-dac96d6f-6d7b-49e4-904b-bb16242f78bb tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] [instance: f42056e5-52cb-4d69-8022-ca643c49194e] Start building networks asynchronously for instance. {{(pid=68906) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 644.675476] env[68906]: DEBUG nova.compute.utils [None req-dac96d6f-6d7b-49e4-904b-bb16242f78bb tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] Using /dev/sd instead of None {{(pid=68906) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 644.676428] env[68906]: DEBUG nova.compute.manager [None req-dac96d6f-6d7b-49e4-904b-bb16242f78bb tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] [instance: f42056e5-52cb-4d69-8022-ca643c49194e] Allocating IP information in the background. {{(pid=68906) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 644.676589] env[68906]: DEBUG nova.network.neutron [None req-dac96d6f-6d7b-49e4-904b-bb16242f78bb tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] [instance: f42056e5-52cb-4d69-8022-ca643c49194e] allocate_for_instance() {{(pid=68906) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 644.691883] env[68906]: DEBUG nova.compute.manager [None req-dac96d6f-6d7b-49e4-904b-bb16242f78bb tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] [instance: f42056e5-52cb-4d69-8022-ca643c49194e] Start building block device mappings for instance. {{(pid=68906) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 644.768246] env[68906]: DEBUG nova.compute.manager [None req-dac96d6f-6d7b-49e4-904b-bb16242f78bb tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] [instance: f42056e5-52cb-4d69-8022-ca643c49194e] Start spawning the instance on the hypervisor. 
{{(pid=68906) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 644.800667] env[68906]: DEBUG nova.virt.hardware [None req-dac96d6f-6d7b-49e4-904b-bb16242f78bb tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-17T13:00:39Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-17T13:00:23Z,direct_url=,disk_format='vmdk',id=b1400c31-d33b-4e13-944f-4c645e62493e,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='1ae7bf3a375d41c6af5e7536af51ffd1',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-17T13:00:24Z,virtual_size=,visibility=), allow threads: False {{(pid=68906) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 644.800927] env[68906]: DEBUG nova.virt.hardware [None req-dac96d6f-6d7b-49e4-904b-bb16242f78bb tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] Flavor limits 0:0:0 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 644.802163] env[68906]: DEBUG nova.virt.hardware [None req-dac96d6f-6d7b-49e4-904b-bb16242f78bb tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] Image limits 0:0:0 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 644.802163] env[68906]: DEBUG nova.virt.hardware [None req-dac96d6f-6d7b-49e4-904b-bb16242f78bb tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] Flavor pref 0:0:0 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 644.802163] env[68906]: DEBUG nova.virt.hardware [None req-dac96d6f-6d7b-49e4-904b-bb16242f78bb tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] Image pref 0:0:0 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 644.802163] env[68906]: DEBUG nova.virt.hardware [None req-dac96d6f-6d7b-49e4-904b-bb16242f78bb tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 644.802163] env[68906]: DEBUG nova.virt.hardware [None req-dac96d6f-6d7b-49e4-904b-bb16242f78bb tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68906) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 644.802442] env[68906]: DEBUG nova.virt.hardware [None req-dac96d6f-6d7b-49e4-904b-bb16242f78bb tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68906) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 644.802442] env[68906]: DEBUG nova.virt.hardware [None 
req-dac96d6f-6d7b-49e4-904b-bb16242f78bb tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] Got 1 possible topologies {{(pid=68906) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 644.802442] env[68906]: DEBUG nova.virt.hardware [None req-dac96d6f-6d7b-49e4-904b-bb16242f78bb tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68906) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 644.802579] env[68906]: DEBUG nova.virt.hardware [None req-dac96d6f-6d7b-49e4-904b-bb16242f78bb tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68906) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 644.803648] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4c61e41a-c186-42c5-9949-41ffbb6f9317 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 644.812491] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-67565b46-7018-42b4-bb46-264d3bc82435 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 645.079319] env[68906]: DEBUG nova.policy [None req-dac96d6f-6d7b-49e4-904b-bb16242f78bb tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '7e8b8fc273be4fa49144f70d1b1b2a3a', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '3d3cc4c86bc14a69a001ef23df615f2c', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68906) authorize /opt/stack/nova/nova/policy.py:203}} [ 645.094131] env[68906]: DEBUG nova.network.neutron [req-65ffb53b-396c-44ac-b193-d580ff975102 req-0f425d44-ee40-480d-b8b5-3d54430e692c service nova] [instance: d2258ded-478a-4530-b940-386286702048] Updated VIF entry in instance network info cache for port aa68d5f0-3302-4547-984f-1d67a625a3ca. 
{{(pid=68906) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 645.094490] env[68906]: DEBUG nova.network.neutron [req-65ffb53b-396c-44ac-b193-d580ff975102 req-0f425d44-ee40-480d-b8b5-3d54430e692c service nova] [instance: d2258ded-478a-4530-b940-386286702048] Updating instance_info_cache with network_info: [{"id": "aa68d5f0-3302-4547-984f-1d67a625a3ca", "address": "fa:16:3e:73:9d:b5", "network": {"id": "033447c9-ca96-4ca5-acc7-ad331d809250", "bridge": "br-int", "label": "tempest-ImagesNegativeTestJSON-1285164177-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "7d38ee4d80bc4d1f87f4cb1f3907fb54", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "d4b43a78-f49b-4132-ab2e-6e28769a9498", "external-id": "nsx-vlan-transportzone-737", "segmentation_id": 737, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapaa68d5f0-33", "ovs_interfaceid": "aa68d5f0-3302-4547-984f-1d67a625a3ca", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68906) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 645.106799] env[68906]: DEBUG oslo_concurrency.lockutils [req-65ffb53b-396c-44ac-b193-d580ff975102 req-0f425d44-ee40-480d-b8b5-3d54430e692c service nova] Releasing lock "refresh_cache-d2258ded-478a-4530-b940-386286702048" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 645.107257] env[68906]: DEBUG nova.compute.manager [req-65ffb53b-396c-44ac-b193-d580ff975102 req-0f425d44-ee40-480d-b8b5-3d54430e692c service nova] [instance: da0c4340-a657-43bd-9a98-4c8f50add720] Received event network-vif-plugged-eb5a0594-e1fd-4786-8944-e71fc85436cb {{(pid=68906) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 645.107553] env[68906]: DEBUG oslo_concurrency.lockutils [req-65ffb53b-396c-44ac-b193-d580ff975102 req-0f425d44-ee40-480d-b8b5-3d54430e692c service nova] Acquiring lock "da0c4340-a657-43bd-9a98-4c8f50add720-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 645.107901] env[68906]: DEBUG oslo_concurrency.lockutils [req-65ffb53b-396c-44ac-b193-d580ff975102 req-0f425d44-ee40-480d-b8b5-3d54430e692c service nova] Lock "da0c4340-a657-43bd-9a98-4c8f50add720-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 645.108217] env[68906]: DEBUG oslo_concurrency.lockutils [req-65ffb53b-396c-44ac-b193-d580ff975102 req-0f425d44-ee40-480d-b8b5-3d54430e692c service nova] Lock "da0c4340-a657-43bd-9a98-4c8f50add720-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 645.108499] env[68906]: DEBUG 
nova.compute.manager [req-65ffb53b-396c-44ac-b193-d580ff975102 req-0f425d44-ee40-480d-b8b5-3d54430e692c service nova] [instance: da0c4340-a657-43bd-9a98-4c8f50add720] No waiting events found dispatching network-vif-plugged-eb5a0594-e1fd-4786-8944-e71fc85436cb {{(pid=68906) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 645.109248] env[68906]: WARNING nova.compute.manager [req-65ffb53b-396c-44ac-b193-d580ff975102 req-0f425d44-ee40-480d-b8b5-3d54430e692c service nova] [instance: da0c4340-a657-43bd-9a98-4c8f50add720] Received unexpected event network-vif-plugged-eb5a0594-e1fd-4786-8944-e71fc85436cb for instance with vm_state building and task_state spawning. [ 645.109868] env[68906]: DEBUG nova.compute.manager [req-65ffb53b-396c-44ac-b193-d580ff975102 req-0f425d44-ee40-480d-b8b5-3d54430e692c service nova] [instance: da0c4340-a657-43bd-9a98-4c8f50add720] Received event network-changed-eb5a0594-e1fd-4786-8944-e71fc85436cb {{(pid=68906) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 645.110270] env[68906]: DEBUG nova.compute.manager [req-65ffb53b-396c-44ac-b193-d580ff975102 req-0f425d44-ee40-480d-b8b5-3d54430e692c service nova] [instance: da0c4340-a657-43bd-9a98-4c8f50add720] Refreshing instance network info cache due to event network-changed-eb5a0594-e1fd-4786-8944-e71fc85436cb. {{(pid=68906) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 645.110570] env[68906]: DEBUG oslo_concurrency.lockutils [req-65ffb53b-396c-44ac-b193-d580ff975102 req-0f425d44-ee40-480d-b8b5-3d54430e692c service nova] Acquiring lock "refresh_cache-da0c4340-a657-43bd-9a98-4c8f50add720" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 645.110811] env[68906]: DEBUG oslo_concurrency.lockutils [req-65ffb53b-396c-44ac-b193-d580ff975102 req-0f425d44-ee40-480d-b8b5-3d54430e692c service nova] Acquired lock "refresh_cache-da0c4340-a657-43bd-9a98-4c8f50add720" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 645.111095] env[68906]: DEBUG nova.network.neutron [req-65ffb53b-396c-44ac-b193-d580ff975102 req-0f425d44-ee40-480d-b8b5-3d54430e692c service nova] [instance: da0c4340-a657-43bd-9a98-4c8f50add720] Refreshing network info cache for port eb5a0594-e1fd-4786-8944-e71fc85436cb {{(pid=68906) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 645.255859] env[68906]: DEBUG nova.network.neutron [None req-a8230d4c-efd4-4641-85fe-bd949bd747e5 tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] [instance: 4edb8b9f-b608-4be8-bfd3-65642710f9bd] Successfully updated port: 57b48c3e-57a8-4ee0-a974-0813b3871e35 {{(pid=68906) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 645.265357] env[68906]: DEBUG oslo_concurrency.lockutils [None req-a8230d4c-efd4-4641-85fe-bd949bd747e5 tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Acquiring lock "refresh_cache-4edb8b9f-b608-4be8-bfd3-65642710f9bd" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 645.265516] env[68906]: DEBUG oslo_concurrency.lockutils [None req-a8230d4c-efd4-4641-85fe-bd949bd747e5 tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Acquired lock "refresh_cache-4edb8b9f-b608-4be8-bfd3-65642710f9bd" {{(pid=68906) lock 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 645.265665] env[68906]: DEBUG nova.network.neutron [None req-a8230d4c-efd4-4641-85fe-bd949bd747e5 tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] [instance: 4edb8b9f-b608-4be8-bfd3-65642710f9bd] Building network info cache for instance {{(pid=68906) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 645.331568] env[68906]: DEBUG nova.network.neutron [None req-a8230d4c-efd4-4641-85fe-bd949bd747e5 tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] [instance: 4edb8b9f-b608-4be8-bfd3-65642710f9bd] Instance cache missing network info. {{(pid=68906) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 646.093497] env[68906]: DEBUG nova.network.neutron [None req-dac96d6f-6d7b-49e4-904b-bb16242f78bb tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] [instance: f42056e5-52cb-4d69-8022-ca643c49194e] Successfully created port: e312579f-5726-4546-9715-ecd7469d54fc {{(pid=68906) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 646.100132] env[68906]: DEBUG oslo_concurrency.lockutils [None req-25b03b1c-8272-403d-9ff2-a1dba7656508 tempest-VolumesAdminNegativeTest-1259132687 tempest-VolumesAdminNegativeTest-1259132687-project-member] Acquiring lock "ce63789a-1f0f-40ca-8368-ac3f84bb58cd" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 646.100267] env[68906]: DEBUG oslo_concurrency.lockutils [None req-25b03b1c-8272-403d-9ff2-a1dba7656508 tempest-VolumesAdminNegativeTest-1259132687 tempest-VolumesAdminNegativeTest-1259132687-project-member] Lock "ce63789a-1f0f-40ca-8368-ac3f84bb58cd" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 646.125041] env[68906]: DEBUG nova.compute.manager [None req-25b03b1c-8272-403d-9ff2-a1dba7656508 tempest-VolumesAdminNegativeTest-1259132687 tempest-VolumesAdminNegativeTest-1259132687-project-member] [instance: ce63789a-1f0f-40ca-8368-ac3f84bb58cd] Starting instance... 
{{(pid=68906) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 646.207221] env[68906]: DEBUG oslo_concurrency.lockutils [None req-25b03b1c-8272-403d-9ff2-a1dba7656508 tempest-VolumesAdminNegativeTest-1259132687 tempest-VolumesAdminNegativeTest-1259132687-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 646.207221] env[68906]: DEBUG oslo_concurrency.lockutils [None req-25b03b1c-8272-403d-9ff2-a1dba7656508 tempest-VolumesAdminNegativeTest-1259132687 tempest-VolumesAdminNegativeTest-1259132687-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 646.208291] env[68906]: INFO nova.compute.claims [None req-25b03b1c-8272-403d-9ff2-a1dba7656508 tempest-VolumesAdminNegativeTest-1259132687 tempest-VolumesAdminNegativeTest-1259132687-project-member] [instance: ce63789a-1f0f-40ca-8368-ac3f84bb58cd] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 646.277804] env[68906]: DEBUG nova.network.neutron [None req-a8230d4c-efd4-4641-85fe-bd949bd747e5 tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] [instance: 4edb8b9f-b608-4be8-bfd3-65642710f9bd] Updating instance_info_cache with network_info: [{"id": "57b48c3e-57a8-4ee0-a974-0813b3871e35", "address": "fa:16:3e:97:e8:13", "network": {"id": "c9025f67-c9f7-4312-b2bd-5fbb06647b07", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-9371784-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "0e206dedfb584e219a7f5dd633032515", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "f16a5584-aed0-4df4-820b-5e7f15977265", "external-id": "cl2-zone-495", "segmentation_id": 495, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap57b48c3e-57", "ovs_interfaceid": "57b48c3e-57a8-4ee0-a974-0813b3871e35", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68906) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 646.279634] env[68906]: DEBUG nova.network.neutron [req-65ffb53b-396c-44ac-b193-d580ff975102 req-0f425d44-ee40-480d-b8b5-3d54430e692c service nova] [instance: da0c4340-a657-43bd-9a98-4c8f50add720] Updated VIF entry in instance network info cache for port eb5a0594-e1fd-4786-8944-e71fc85436cb. 
{{(pid=68906) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 646.279923] env[68906]: DEBUG nova.network.neutron [req-65ffb53b-396c-44ac-b193-d580ff975102 req-0f425d44-ee40-480d-b8b5-3d54430e692c service nova] [instance: da0c4340-a657-43bd-9a98-4c8f50add720] Updating instance_info_cache with network_info: [{"id": "eb5a0594-e1fd-4786-8944-e71fc85436cb", "address": "fa:16:3e:48:ee:36", "network": {"id": "63efabfb-0028-4758-9626-5f9860440121", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.209", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "1ae7bf3a375d41c6af5e7536af51ffd1", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "69054a13-b7ef-44e1-bd3b-3ca5ba602848", "external-id": "nsx-vlan-transportzone-153", "segmentation_id": 153, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapeb5a0594-e1", "ovs_interfaceid": "eb5a0594-e1fd-4786-8944-e71fc85436cb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68906) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 646.297398] env[68906]: DEBUG oslo_concurrency.lockutils [None req-a8230d4c-efd4-4641-85fe-bd949bd747e5 tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Releasing lock "refresh_cache-4edb8b9f-b608-4be8-bfd3-65642710f9bd" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 646.297696] env[68906]: DEBUG nova.compute.manager [None req-a8230d4c-efd4-4641-85fe-bd949bd747e5 tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] [instance: 4edb8b9f-b608-4be8-bfd3-65642710f9bd] Instance network_info: |[{"id": "57b48c3e-57a8-4ee0-a974-0813b3871e35", "address": "fa:16:3e:97:e8:13", "network": {"id": "c9025f67-c9f7-4312-b2bd-5fbb06647b07", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-9371784-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "0e206dedfb584e219a7f5dd633032515", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "f16a5584-aed0-4df4-820b-5e7f15977265", "external-id": "cl2-zone-495", "segmentation_id": 495, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap57b48c3e-57", "ovs_interfaceid": "57b48c3e-57a8-4ee0-a974-0813b3871e35", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=68906) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 646.298098] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None 
req-a8230d4c-efd4-4641-85fe-bd949bd747e5 tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] [instance: 4edb8b9f-b608-4be8-bfd3-65642710f9bd] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:97:e8:13', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'f16a5584-aed0-4df4-820b-5e7f15977265', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '57b48c3e-57a8-4ee0-a974-0813b3871e35', 'vif_model': 'vmxnet3'}] {{(pid=68906) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 646.305507] env[68906]: DEBUG nova.virt.vmwareapi.vm_util [None req-a8230d4c-efd4-4641-85fe-bd949bd747e5 tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Creating folder: Project (0e206dedfb584e219a7f5dd633032515). Parent ref: group-v694750. {{(pid=68906) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 646.306327] env[68906]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-8029b515-5b92-4b31-af1e-a95316bd4199 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 646.308962] env[68906]: DEBUG oslo_concurrency.lockutils [req-65ffb53b-396c-44ac-b193-d580ff975102 req-0f425d44-ee40-480d-b8b5-3d54430e692c service nova] Releasing lock "refresh_cache-da0c4340-a657-43bd-9a98-4c8f50add720" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 646.323288] env[68906]: INFO nova.virt.vmwareapi.vm_util [None req-a8230d4c-efd4-4641-85fe-bd949bd747e5 tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Created folder: Project (0e206dedfb584e219a7f5dd633032515) in parent group-v694750. [ 646.323288] env[68906]: DEBUG nova.virt.vmwareapi.vm_util [None req-a8230d4c-efd4-4641-85fe-bd949bd747e5 tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Creating folder: Instances. Parent ref: group-v694769. {{(pid=68906) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 646.323650] env[68906]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-2a580364-e42a-48dd-9451-b849db2866a1 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 646.334697] env[68906]: INFO nova.virt.vmwareapi.vm_util [None req-a8230d4c-efd4-4641-85fe-bd949bd747e5 tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Created folder: Instances in parent group-v694769. [ 646.334697] env[68906]: DEBUG oslo.service.loopingcall [None req-a8230d4c-efd4-4641-85fe-bd949bd747e5 tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=68906) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 646.334697] env[68906]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 4edb8b9f-b608-4be8-bfd3-65642710f9bd] Creating VM on the ESX host {{(pid=68906) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 646.334947] env[68906]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-4edea928-cef5-4fbe-96f1-0bb8d86e25d5 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 646.362118] env[68906]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 646.362118] env[68906]: value = "task-3475272" [ 646.362118] env[68906]: _type = "Task" [ 646.362118] env[68906]: } to complete. {{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 646.367746] env[68906]: DEBUG oslo_vmware.api [-] Task: {'id': task-3475272, 'name': CreateVM_Task} progress is 0%. {{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 646.423504] env[68906]: DEBUG nova.network.neutron [None req-579f131a-2dbc-470d-9058-ec3a343d421d tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: d6ca51b9-b284-405c-878e-fdbc326b73e1] Successfully updated port: 390f6aeb-46ca-4723-b36d-949492fa4618 {{(pid=68906) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 646.441871] env[68906]: DEBUG oslo_concurrency.lockutils [None req-579f131a-2dbc-470d-9058-ec3a343d421d tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Acquiring lock "refresh_cache-d6ca51b9-b284-405c-878e-fdbc326b73e1" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 646.442077] env[68906]: DEBUG oslo_concurrency.lockutils [None req-579f131a-2dbc-470d-9058-ec3a343d421d tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Acquired lock "refresh_cache-d6ca51b9-b284-405c-878e-fdbc326b73e1" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 646.442237] env[68906]: DEBUG nova.network.neutron [None req-579f131a-2dbc-470d-9058-ec3a343d421d tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: d6ca51b9-b284-405c-878e-fdbc326b73e1] Building network info cache for instance {{(pid=68906) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 646.491674] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3e7b732a-c896-4d95-ae38-7c0c848b5fef {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 646.499208] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-77a9f420-f221-44af-8631-30b9264dbfa8 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 646.533489] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d4fbb2e9-8fac-4416-9e13-04871abbdfcf {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 646.541937] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6b96ca1f-5e86-4a2a-89c9-ba6f3f194819 
[ 646.541937] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6b96ca1f-5e86-4a2a-89c9-ba6f3f194819 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 646.556404] env[68906]: DEBUG nova.compute.provider_tree [None req-25b03b1c-8272-403d-9ff2-a1dba7656508 tempest-VolumesAdminNegativeTest-1259132687 tempest-VolumesAdminNegativeTest-1259132687-project-member] Inventory has not changed in ProviderTree for provider: 1119f6db-bfd7-4ef3-bdff-5c6974dc249b {{(pid=68906) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 646.566264] env[68906]: DEBUG nova.scheduler.client.report [None req-25b03b1c-8272-403d-9ff2-a1dba7656508 tempest-VolumesAdminNegativeTest-1259132687 tempest-VolumesAdminNegativeTest-1259132687-project-member] Inventory has not changed for provider 1119f6db-bfd7-4ef3-bdff-5c6974dc249b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 93, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68906) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 646.583841] env[68906]: DEBUG oslo_concurrency.lockutils [None req-25b03b1c-8272-403d-9ff2-a1dba7656508 tempest-VolumesAdminNegativeTest-1259132687 tempest-VolumesAdminNegativeTest-1259132687-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.377s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 646.584429] env[68906]: DEBUG nova.compute.manager [None req-25b03b1c-8272-403d-9ff2-a1dba7656508 tempest-VolumesAdminNegativeTest-1259132687 tempest-VolumesAdminNegativeTest-1259132687-project-member] [instance: ce63789a-1f0f-40ca-8368-ac3f84bb58cd] Start building networks asynchronously for instance. {{(pid=68906) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}}
[ 646.626086] env[68906]: DEBUG nova.compute.utils [None req-25b03b1c-8272-403d-9ff2-a1dba7656508 tempest-VolumesAdminNegativeTest-1259132687 tempest-VolumesAdminNegativeTest-1259132687-project-member] Using /dev/sd instead of None {{(pid=68906) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}}
[ 646.627655] env[68906]: DEBUG nova.compute.manager [None req-25b03b1c-8272-403d-9ff2-a1dba7656508 tempest-VolumesAdminNegativeTest-1259132687 tempest-VolumesAdminNegativeTest-1259132687-project-member] [instance: ce63789a-1f0f-40ca-8368-ac3f84bb58cd] Allocating IP information in the background. {{(pid=68906) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}}
[ 646.627826] env[68906]: DEBUG nova.network.neutron [None req-25b03b1c-8272-403d-9ff2-a1dba7656508 tempest-VolumesAdminNegativeTest-1259132687 tempest-VolumesAdminNegativeTest-1259132687-project-member] [instance: ce63789a-1f0f-40ca-8368-ac3f84bb58cd] allocate_for_instance() {{(pid=68906) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}}
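[Editor's note: the inventory record reported to Placement above carries total, reserved and allocation_ratio per resource class; Placement derives allocatable capacity as (total - reserved) * allocation_ratio, with max_unit capping any single allocation. Worked through with the logged numbers:]

```python
# Effective capacity Placement derives from the inventory logged above.
inventory = {
    'VCPU':      {'total': 48,     'reserved': 0,   'max_unit': 16,    'allocation_ratio': 4.0},
    'MEMORY_MB': {'total': 196590, 'reserved': 512, 'max_unit': 65530, 'allocation_ratio': 1.0},
    'DISK_GB':   {'total': 400,    'reserved': 0,   'max_unit': 93,    'allocation_ratio': 1.0},
}

for rc, inv in inventory.items():
    capacity = (inv['total'] - inv['reserved']) * inv['allocation_ratio']
    print(f"{rc}: {capacity:.0f} allocatable, at most {inv['max_unit']} per allocation")
# VCPU: 192 allocatable, at most 16 per allocation
# MEMORY_MB: 196078 allocatable, at most 65530 per allocation
# DISK_GB: 400 allocatable, at most 93 per allocation
```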
[ 646.637712] env[68906]: DEBUG nova.compute.manager [None req-25b03b1c-8272-403d-9ff2-a1dba7656508 tempest-VolumesAdminNegativeTest-1259132687 tempest-VolumesAdminNegativeTest-1259132687-project-member] [instance: ce63789a-1f0f-40ca-8368-ac3f84bb58cd] Start building block device mappings for instance. {{(pid=68906) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}}
[ 646.712019] env[68906]: DEBUG nova.policy [None req-25b03b1c-8272-403d-9ff2-a1dba7656508 tempest-VolumesAdminNegativeTest-1259132687 tempest-VolumesAdminNegativeTest-1259132687-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '174d65a2d4ea40528b2b5c30986e3e81', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '38f7116c64254c4ca65c358856b9b0e5', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68906) authorize /opt/stack/nova/nova/policy.py:203}}
[ 646.721630] env[68906]: DEBUG nova.compute.manager [None req-25b03b1c-8272-403d-9ff2-a1dba7656508 tempest-VolumesAdminNegativeTest-1259132687 tempest-VolumesAdminNegativeTest-1259132687-project-member] [instance: ce63789a-1f0f-40ca-8368-ac3f84bb58cd] Start spawning the instance on the hypervisor. {{(pid=68906) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}}
[ 646.754833] env[68906]: DEBUG nova.virt.hardware [None req-25b03b1c-8272-403d-9ff2-a1dba7656508 tempest-VolumesAdminNegativeTest-1259132687 tempest-VolumesAdminNegativeTest-1259132687-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-17T13:00:39Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-17T13:00:23Z,direct_url=,disk_format='vmdk',id=b1400c31-d33b-4e13-944f-4c645e62493e,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='1ae7bf3a375d41c6af5e7536af51ffd1',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-17T13:00:24Z,virtual_size=,visibility=), allow threads: False {{(pid=68906) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}}
[ 646.755120] env[68906]: DEBUG nova.virt.hardware [None req-25b03b1c-8272-403d-9ff2-a1dba7656508 tempest-VolumesAdminNegativeTest-1259132687 tempest-VolumesAdminNegativeTest-1259132687-project-member] Flavor limits 0:0:0 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}}
[ 646.755284] env[68906]: DEBUG nova.virt.hardware [None req-25b03b1c-8272-403d-9ff2-a1dba7656508 tempest-VolumesAdminNegativeTest-1259132687 tempest-VolumesAdminNegativeTest-1259132687-project-member] Image limits 0:0:0 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}}
[ 646.755756] env[68906]: DEBUG nova.virt.hardware [None req-25b03b1c-8272-403d-9ff2-a1dba7656508 tempest-VolumesAdminNegativeTest-1259132687 tempest-VolumesAdminNegativeTest-1259132687-project-member] Flavor pref 0:0:0 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}}
[ 646.755756] env[68906]: DEBUG nova.virt.hardware [None req-25b03b1c-8272-403d-9ff2-a1dba7656508 tempest-VolumesAdminNegativeTest-1259132687 tempest-VolumesAdminNegativeTest-1259132687-project-member] Image pref 0:0:0 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}}
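[Editor's note: the nova.policy entry above shows a member/reader credential failing the network:attach_external_network check. A toy stand-in for that kind of check — real Nova evaluates oslo.policy rules, and the admin-only rule body below is an assumption for illustration only:]

```python
# Mirrors the observable outcome of the log entry above: an admin-only
# rule fails for a non-admin member/reader credential.
RULES = {
    # assumption: the rule requires the admin flag
    "network:attach_external_network": lambda creds: creds.get("is_admin", False),
}

def authorize(rule, creds):
    allowed = RULES[rule](creds)
    if not allowed:
        print(f"Policy check for {rule} failed with credentials {creds}")
    return allowed

creds = {"is_admin": False, "roles": ["member", "reader"]}
authorize("network:attach_external_network", creds)   # False, logged as above
```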
[ 646.755756] env[68906]: DEBUG nova.virt.hardware [None req-25b03b1c-8272-403d-9ff2-a1dba7656508 tempest-VolumesAdminNegativeTest-1259132687 tempest-VolumesAdminNegativeTest-1259132687-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}}
[ 646.755960] env[68906]: DEBUG nova.virt.hardware [None req-25b03b1c-8272-403d-9ff2-a1dba7656508 tempest-VolumesAdminNegativeTest-1259132687 tempest-VolumesAdminNegativeTest-1259132687-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68906) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}}
[ 646.756440] env[68906]: DEBUG nova.virt.hardware [None req-25b03b1c-8272-403d-9ff2-a1dba7656508 tempest-VolumesAdminNegativeTest-1259132687 tempest-VolumesAdminNegativeTest-1259132687-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68906) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}}
[ 646.756828] env[68906]: DEBUG nova.virt.hardware [None req-25b03b1c-8272-403d-9ff2-a1dba7656508 tempest-VolumesAdminNegativeTest-1259132687 tempest-VolumesAdminNegativeTest-1259132687-project-member] Got 1 possible topologies {{(pid=68906) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}}
[ 646.756828] env[68906]: DEBUG nova.virt.hardware [None req-25b03b1c-8272-403d-9ff2-a1dba7656508 tempest-VolumesAdminNegativeTest-1259132687 tempest-VolumesAdminNegativeTest-1259132687-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68906) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}}
[ 646.756960] env[68906]: DEBUG nova.virt.hardware [None req-25b03b1c-8272-403d-9ff2-a1dba7656508 tempest-VolumesAdminNegativeTest-1259132687 tempest-VolumesAdminNegativeTest-1259132687-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68906) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}}
[ 646.759129] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1f559199-dd35-4571-88bb-2d6074b0e6b6 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 646.769118] env[68906]: DEBUG nova.network.neutron [None req-579f131a-2dbc-470d-9058-ec3a343d421d tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: d6ca51b9-b284-405c-878e-fdbc326b73e1] Instance cache missing network info. {{(pid=68906) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 646.773792] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-efc04bbe-2642-42c7-beb4-d768f5eb97aa {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
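[Editor's note: the topology entries above enumerate (sockets, cores, threads) combinations that multiply out to the flavor's vCPU count, then sort them by preference. A simplified sketch of that search — Nova's real version also handles NUMA and preferred orderings, and the logged limits were 65536 rather than the small bounds used here:]

```python
import itertools

def possible_cpu_topologies(vcpus, max_sockets, max_cores, max_threads):
    """Yield (sockets, cores, threads) combinations whose product is
    exactly `vcpus`, within the given limits."""
    for s, c, t in itertools.product(range(1, max_sockets + 1),
                                     range(1, max_cores + 1),
                                     range(1, max_threads + 1)):
        if s * c * t == vcpus:
            yield (s, c, t)

# The m1.nano flavor above has 1 vCPU and no explicit limits, so the
# only possible topology is 1 socket x 1 core x 1 thread:
print(list(possible_cpu_topologies(1, 4, 4, 4)))   # [(1, 1, 1)]
```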
[ 646.871071] env[68906]: DEBUG oslo_vmware.api [-] Task: {'id': task-3475272, 'name': CreateVM_Task, 'duration_secs': 0.324633} completed successfully. {{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}}
[ 646.871469] env[68906]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 4edb8b9f-b608-4be8-bfd3-65642710f9bd] Created VM on the ESX host {{(pid=68906) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}}
[ 646.872271] env[68906]: DEBUG oslo_concurrency.lockutils [None req-a8230d4c-efd4-4641-85fe-bd949bd747e5 tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 646.872804] env[68906]: DEBUG oslo_concurrency.lockutils [None req-a8230d4c-efd4-4641-85fe-bd949bd747e5 tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Acquired lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 646.873326] env[68906]: DEBUG oslo_concurrency.lockutils [None req-a8230d4c-efd4-4641-85fe-bd949bd747e5 tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}}
[ 646.873714] env[68906]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-f9cbc098-5332-40b5-a8fd-9049e9d1c42e {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 646.879200] env[68906]: DEBUG oslo_vmware.api [None req-a8230d4c-efd4-4641-85fe-bd949bd747e5 tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Waiting for the task: (returnval){
[ 646.879200] env[68906]: value = "session[52a3cb0d-1212-490c-2669-91043b4da4d8]52010ac2-f1f2-1158-afa4-447df955f71d"
[ 646.879200] env[68906]: _type = "Task"
[ 646.879200] env[68906]: } to complete. {{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
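[Editor's note: the lock named "[datastore2] devstack-image-cache_base/<image-id>" above serializes cache fills per image, so concurrent spawns of the same image wait while different images proceed in parallel. A minimal sketch of that pattern with oslo.concurrency; the cache helpers are hypothetical stand-ins for Nova's real datastore code:]

```python
from oslo_concurrency import lockutils

_cache = set()   # hypothetical stand-in for the datastore image cache

def fetch_image_if_missing(image_id, datastore="datastore2"):
    """Fill the per-datastore image cache exactly once per image."""
    lock_name = f"[{datastore}] devstack-image-cache_base/{image_id}"
    with lockutils.lock(lock_name):      # the Acquiring/Acquired entries above
        if image_id not in _cache:       # first spawn downloads the image...
            _cache.add(image_id)         # ...later spawns reuse the cached copy
```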
[ 646.887942] env[68906]: DEBUG oslo_vmware.api [None req-a8230d4c-efd4-4641-85fe-bd949bd747e5 tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Task: {'id': session[52a3cb0d-1212-490c-2669-91043b4da4d8]52010ac2-f1f2-1158-afa4-447df955f71d, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 647.095479] env[68906]: DEBUG nova.network.neutron [None req-25b03b1c-8272-403d-9ff2-a1dba7656508 tempest-VolumesAdminNegativeTest-1259132687 tempest-VolumesAdminNegativeTest-1259132687-project-member] [instance: ce63789a-1f0f-40ca-8368-ac3f84bb58cd] Successfully created port: 81733f57-0e3f-4d59-8413-61c98911ea1f {{(pid=68906) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}}
[ 647.393855] env[68906]: DEBUG oslo_concurrency.lockutils [None req-a8230d4c-efd4-4641-85fe-bd949bd747e5 tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Releasing lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 647.394595] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-a8230d4c-efd4-4641-85fe-bd949bd747e5 tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] [instance: 4edb8b9f-b608-4be8-bfd3-65642710f9bd] Processing image b1400c31-d33b-4e13-944f-4c645e62493e {{(pid=68906) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}}
[ 647.394908] env[68906]: DEBUG oslo_concurrency.lockutils [None req-a8230d4c-efd4-4641-85fe-bd949bd747e5 tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e/b1400c31-d33b-4e13-944f-4c645e62493e.vmdk" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 647.468301] env[68906]: DEBUG nova.network.neutron [None req-579f131a-2dbc-470d-9058-ec3a343d421d tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: d6ca51b9-b284-405c-878e-fdbc326b73e1] Updating instance_info_cache with network_info: [{"id": "390f6aeb-46ca-4723-b36d-949492fa4618", "address": "fa:16:3e:23:42:77", "network": {"id": "da6ba094-8e2a-4f76-813c-8668f482685b", "bridge": "br-int", "label": "tempest-ServersTestJSON-512380607-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "90f212f7916446919081fcdc0527ebb0", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "0cd5d325-3053-407e-a4ee-f627e82a23f9", "external-id": "nsx-vlan-transportzone-809", "segmentation_id": 809, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap390f6aeb-46", "ovs_interfaceid": "390f6aeb-46ca-4723-b36d-949492fa4618", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68906) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
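[Editor's note: the instance_info_cache entry above is a JSON list of VIFs, each with a nested network/subnets/ips structure. A small sketch of pulling the useful fields out of it; the record below is the logged one trimmed to the fields the sketch reads:]

```python
import json

network_info = json.loads('''[{
  "id": "390f6aeb-46ca-4723-b36d-949492fa4618",
  "address": "fa:16:3e:23:42:77",
  "network": {"subnets": [{"cidr": "192.168.128.0/28",
                           "ips": [{"address": "192.168.128.13", "type": "fixed"}]}],
              "meta": {"mtu": 8950}},
  "devname": "tap390f6aeb-46"}]''')

for vif in network_info:
    fixed_ips = [ip["address"]
                 for subnet in vif["network"]["subnets"]
                 for ip in subnet["ips"] if ip["type"] == "fixed"]
    print(vif["devname"], vif["address"], fixed_ips, vif["network"]["meta"]["mtu"])
# tap390f6aeb-46 fa:16:3e:23:42:77 ['192.168.128.13'] 8950
```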
"refresh_cache-d6ca51b9-b284-405c-878e-fdbc326b73e1" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 647.488339] env[68906]: DEBUG nova.compute.manager [None req-579f131a-2dbc-470d-9058-ec3a343d421d tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: d6ca51b9-b284-405c-878e-fdbc326b73e1] Instance network_info: |[{"id": "390f6aeb-46ca-4723-b36d-949492fa4618", "address": "fa:16:3e:23:42:77", "network": {"id": "da6ba094-8e2a-4f76-813c-8668f482685b", "bridge": "br-int", "label": "tempest-ServersTestJSON-512380607-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "90f212f7916446919081fcdc0527ebb0", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "0cd5d325-3053-407e-a4ee-f627e82a23f9", "external-id": "nsx-vlan-transportzone-809", "segmentation_id": 809, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap390f6aeb-46", "ovs_interfaceid": "390f6aeb-46ca-4723-b36d-949492fa4618", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=68906) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 647.488734] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-579f131a-2dbc-470d-9058-ec3a343d421d tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: d6ca51b9-b284-405c-878e-fdbc326b73e1] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:23:42:77', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '0cd5d325-3053-407e-a4ee-f627e82a23f9', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '390f6aeb-46ca-4723-b36d-949492fa4618', 'vif_model': 'vmxnet3'}] {{(pid=68906) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 647.498233] env[68906]: DEBUG nova.virt.vmwareapi.vm_util [None req-579f131a-2dbc-470d-9058-ec3a343d421d tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Creating folder: Project (90f212f7916446919081fcdc0527ebb0). Parent ref: group-v694750. {{(pid=68906) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 647.498929] env[68906]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-fdfd675e-bdfc-4c5f-98cc-3668a5ed8cfa {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 647.510214] env[68906]: INFO nova.virt.vmwareapi.vm_util [None req-579f131a-2dbc-470d-9058-ec3a343d421d tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Created folder: Project (90f212f7916446919081fcdc0527ebb0) in parent group-v694750. [ 647.510455] env[68906]: DEBUG nova.virt.vmwareapi.vm_util [None req-579f131a-2dbc-470d-9058-ec3a343d421d tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Creating folder: Instances. Parent ref: group-v694775. 
[ 647.510455] env[68906]: DEBUG nova.virt.vmwareapi.vm_util [None req-579f131a-2dbc-470d-9058-ec3a343d421d tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Creating folder: Instances. Parent ref: group-v694775. {{(pid=68906) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}}
[ 647.510818] env[68906]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-ba826cb0-4377-43a9-8766-f17010f9b0ac {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 647.521893] env[68906]: INFO nova.virt.vmwareapi.vm_util [None req-579f131a-2dbc-470d-9058-ec3a343d421d tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Created folder: Instances in parent group-v694775.
[ 647.522107] env[68906]: DEBUG oslo.service.loopingcall [None req-579f131a-2dbc-470d-9058-ec3a343d421d tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=68906) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}}
[ 647.522247] env[68906]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: d6ca51b9-b284-405c-878e-fdbc326b73e1] Creating VM on the ESX host {{(pid=68906) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}}
[ 647.522531] env[68906]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-9fd3d1ad-803e-487a-916b-c59bda7db47d {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 647.544313] env[68906]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){
[ 647.544313] env[68906]: value = "task-3475279"
[ 647.544313] env[68906]: _type = "Task"
[ 647.544313] env[68906]: } to complete. {{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 647.554693] env[68906]: DEBUG oslo_vmware.api [-] Task: {'id': task-3475279, 'name': CreateVM_Task} progress is 0%. {{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 647.997297] env[68906]: DEBUG oslo_concurrency.lockutils [None req-237bd299-a887-4870-a49f-dd296ebfa92b tempest-VolumesAssistedSnapshotsTest-429909811 tempest-VolumesAssistedSnapshotsTest-429909811-project-member] Acquiring lock "9a2d2803-34b1-40f7-9349-e5734a217e18" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 647.998121] env[68906]: DEBUG oslo_concurrency.lockutils [None req-237bd299-a887-4870-a49f-dd296ebfa92b tempest-VolumesAssistedSnapshotsTest-429909811 tempest-VolumesAssistedSnapshotsTest-429909811-project-member] Lock "9a2d2803-34b1-40f7-9349-e5734a217e18" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.002s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
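[Editor's note: the "Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return" entries above come from oslo.service's looping-call primitive: a callback runs on a fixed interval and raises LoopingCallDone to stop the loop and hand back a value. A minimal sketch of that pattern, assuming `fn` returns None while work is pending:]

```python
from oslo_service import loopingcall

def wait_for_function(fn, interval=0.5):
    """Run `fn` periodically until it signals completion."""
    def _poll():
        result = fn()
        if result is not None:
            # Stops the loop; .wait() below returns this value.
            raise loopingcall.LoopingCallDone(result)

    timer = loopingcall.FixedIntervalLoopingCall(_poll)
    return timer.start(interval=interval).wait()
```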
[ 648.057430] env[68906]: DEBUG oslo_vmware.api [-] Task: {'id': task-3475279, 'name': CreateVM_Task, 'duration_secs': 0.487869} completed successfully. {{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}}
[ 648.057567] env[68906]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: d6ca51b9-b284-405c-878e-fdbc326b73e1] Created VM on the ESX host {{(pid=68906) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}}
[ 648.061482] env[68906]: DEBUG oslo_concurrency.lockutils [None req-579f131a-2dbc-470d-9058-ec3a343d421d tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 648.061655] env[68906]: DEBUG oslo_concurrency.lockutils [None req-579f131a-2dbc-470d-9058-ec3a343d421d tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Acquired lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 648.061989] env[68906]: DEBUG oslo_concurrency.lockutils [None req-579f131a-2dbc-470d-9058-ec3a343d421d tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}}
[ 648.062678] env[68906]: DEBUG oslo_concurrency.lockutils [None req-f739b187-6eca-4474-a1be-8f642c15af34 tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] Acquiring lock "13eebe4e-5984-46c3-bb73-cd783ad45df6" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 648.062880] env[68906]: DEBUG oslo_concurrency.lockutils [None req-f739b187-6eca-4474-a1be-8f642c15af34 tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] Lock "13eebe4e-5984-46c3-bb73-cd783ad45df6" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 648.063369] env[68906]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-704ed836-44a2-446a-b632-3bec9e75aeaa {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 648.071560] env[68906]: DEBUG oslo_vmware.api [None req-579f131a-2dbc-470d-9058-ec3a343d421d tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Waiting for the task: (returnval){
[ 648.071560] env[68906]: value = "session[52a3cb0d-1212-490c-2669-91043b4da4d8]52294daf-851a-228e-4ba9-73740492b069"
[ 648.071560] env[68906]: _type = "Task"
[ 648.071560] env[68906]: } to complete. {{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 648.081884] env[68906]: DEBUG oslo_vmware.api [None req-579f131a-2dbc-470d-9058-ec3a343d421d tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Task: {'id': session[52a3cb0d-1212-490c-2669-91043b4da4d8]52294daf-851a-228e-4ba9-73740492b069, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 648.100948] env[68906]: DEBUG nova.network.neutron [None req-25b03b1c-8272-403d-9ff2-a1dba7656508 tempest-VolumesAdminNegativeTest-1259132687 tempest-VolumesAdminNegativeTest-1259132687-project-member] [instance: ce63789a-1f0f-40ca-8368-ac3f84bb58cd] Successfully updated port: 81733f57-0e3f-4d59-8413-61c98911ea1f {{(pid=68906) _update_port /opt/stack/nova/nova/network/neutron.py:586}}
[ 648.121760] env[68906]: DEBUG oslo_concurrency.lockutils [None req-25b03b1c-8272-403d-9ff2-a1dba7656508 tempest-VolumesAdminNegativeTest-1259132687 tempest-VolumesAdminNegativeTest-1259132687-project-member] Acquiring lock "refresh_cache-ce63789a-1f0f-40ca-8368-ac3f84bb58cd" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 648.121760] env[68906]: DEBUG oslo_concurrency.lockutils [None req-25b03b1c-8272-403d-9ff2-a1dba7656508 tempest-VolumesAdminNegativeTest-1259132687 tempest-VolumesAdminNegativeTest-1259132687-project-member] Acquired lock "refresh_cache-ce63789a-1f0f-40ca-8368-ac3f84bb58cd" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 648.121760] env[68906]: DEBUG nova.network.neutron [None req-25b03b1c-8272-403d-9ff2-a1dba7656508 tempest-VolumesAdminNegativeTest-1259132687 tempest-VolumesAdminNegativeTest-1259132687-project-member] [instance: ce63789a-1f0f-40ca-8368-ac3f84bb58cd] Building network info cache for instance {{(pid=68906) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 648.184031] env[68906]: DEBUG nova.network.neutron [None req-25b03b1c-8272-403d-9ff2-a1dba7656508 tempest-VolumesAdminNegativeTest-1259132687 tempest-VolumesAdminNegativeTest-1259132687-project-member] [instance: ce63789a-1f0f-40ca-8368-ac3f84bb58cd] Instance cache missing network info. {{(pid=68906) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 648.423274] env[68906]: DEBUG nova.network.neutron [None req-dac96d6f-6d7b-49e4-904b-bb16242f78bb tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] [instance: f42056e5-52cb-4d69-8022-ca643c49194e] Successfully updated port: e312579f-5726-4546-9715-ecd7469d54fc {{(pid=68906) _update_port /opt/stack/nova/nova/network/neutron.py:586}}
[ 648.439347] env[68906]: DEBUG nova.network.neutron [None req-25b03b1c-8272-403d-9ff2-a1dba7656508 tempest-VolumesAdminNegativeTest-1259132687 tempest-VolumesAdminNegativeTest-1259132687-project-member] [instance: ce63789a-1f0f-40ca-8368-ac3f84bb58cd] Updating instance_info_cache with network_info: [{"id": "81733f57-0e3f-4d59-8413-61c98911ea1f", "address": "fa:16:3e:8f:e8:0d", "network": {"id": "b3851e79-234a-4664-ac56-9020c209bcd1", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-1765566724-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "38f7116c64254c4ca65c358856b9b0e5", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "d6e940e5-e083-4238-973e-f1b4e2a3a5c7", "external-id": "nsx-vlan-transportzone-64", "segmentation_id": 64, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap81733f57-0e", "ovs_interfaceid": "81733f57-0e3f-4d59-8413-61c98911ea1f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68906) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 648.443603] env[68906]: DEBUG oslo_concurrency.lockutils [None req-dac96d6f-6d7b-49e4-904b-bb16242f78bb tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] Acquiring lock "refresh_cache-f42056e5-52cb-4d69-8022-ca643c49194e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 648.443749] env[68906]: DEBUG oslo_concurrency.lockutils [None req-dac96d6f-6d7b-49e4-904b-bb16242f78bb tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] Acquired lock "refresh_cache-f42056e5-52cb-4d69-8022-ca643c49194e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 648.443893] env[68906]: DEBUG nova.network.neutron [None req-dac96d6f-6d7b-49e4-904b-bb16242f78bb tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] [instance: f42056e5-52cb-4d69-8022-ca643c49194e] Building network info cache for instance {{(pid=68906) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 648.464581] env[68906]: DEBUG oslo_concurrency.lockutils [None req-25b03b1c-8272-403d-9ff2-a1dba7656508 tempest-VolumesAdminNegativeTest-1259132687 tempest-VolumesAdminNegativeTest-1259132687-project-member] Releasing lock "refresh_cache-ce63789a-1f0f-40ca-8368-ac3f84bb58cd" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 648.465815] env[68906]: DEBUG nova.compute.manager [None req-25b03b1c-8272-403d-9ff2-a1dba7656508 tempest-VolumesAdminNegativeTest-1259132687 tempest-VolumesAdminNegativeTest-1259132687-project-member] [instance: ce63789a-1f0f-40ca-8368-ac3f84bb58cd] Instance network_info: |[{"id": "81733f57-0e3f-4d59-8413-61c98911ea1f", "address": "fa:16:3e:8f:e8:0d", "network": {"id": "b3851e79-234a-4664-ac56-9020c209bcd1", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-1765566724-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "38f7116c64254c4ca65c358856b9b0e5", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "d6e940e5-e083-4238-973e-f1b4e2a3a5c7", "external-id": "nsx-vlan-transportzone-64", "segmentation_id": 64, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap81733f57-0e", "ovs_interfaceid": "81733f57-0e3f-4d59-8413-61c98911ea1f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=68906) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}}
[ 648.470753] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-25b03b1c-8272-403d-9ff2-a1dba7656508 tempest-VolumesAdminNegativeTest-1259132687 tempest-VolumesAdminNegativeTest-1259132687-project-member] [instance: ce63789a-1f0f-40ca-8368-ac3f84bb58cd] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:8f:e8:0d', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'd6e940e5-e083-4238-973e-f1b4e2a3a5c7', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '81733f57-0e3f-4d59-8413-61c98911ea1f', 'vif_model': 'vmxnet3'}] {{(pid=68906) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}}
[ 648.485530] env[68906]: DEBUG nova.virt.vmwareapi.vm_util [None req-25b03b1c-8272-403d-9ff2-a1dba7656508 tempest-VolumesAdminNegativeTest-1259132687 tempest-VolumesAdminNegativeTest-1259132687-project-member] Creating folder: Project (38f7116c64254c4ca65c358856b9b0e5). Parent ref: group-v694750. {{(pid=68906) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}}
[ 648.486455] env[68906]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-0b2859c1-5ecc-40d6-a512-907c73fc2875 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 648.499708] env[68906]: INFO nova.virt.vmwareapi.vm_util [None req-25b03b1c-8272-403d-9ff2-a1dba7656508 tempest-VolumesAdminNegativeTest-1259132687 tempest-VolumesAdminNegativeTest-1259132687-project-member] Created folder: Project (38f7116c64254c4ca65c358856b9b0e5) in parent group-v694750.
[ 648.500058] env[68906]: DEBUG nova.virt.vmwareapi.vm_util [None req-25b03b1c-8272-403d-9ff2-a1dba7656508 tempest-VolumesAdminNegativeTest-1259132687 tempest-VolumesAdminNegativeTest-1259132687-project-member] Creating folder: Instances. Parent ref: group-v694778. {{(pid=68906) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}}
[ 648.501415] env[68906]: DEBUG nova.network.neutron [None req-dac96d6f-6d7b-49e4-904b-bb16242f78bb tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] [instance: f42056e5-52cb-4d69-8022-ca643c49194e] Instance cache missing network info. {{(pid=68906) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 648.504867] env[68906]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-b8854d1b-26cf-4771-9c99-db972890b22d {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 648.517990] env[68906]: INFO nova.virt.vmwareapi.vm_util [None req-25b03b1c-8272-403d-9ff2-a1dba7656508 tempest-VolumesAdminNegativeTest-1259132687 tempest-VolumesAdminNegativeTest-1259132687-project-member] Created folder: Instances in parent group-v694778.
[ 648.518472] env[68906]: DEBUG oslo.service.loopingcall [None req-25b03b1c-8272-403d-9ff2-a1dba7656508 tempest-VolumesAdminNegativeTest-1259132687 tempest-VolumesAdminNegativeTest-1259132687-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=68906) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}}
[ 648.518767] env[68906]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: ce63789a-1f0f-40ca-8368-ac3f84bb58cd] Creating VM on the ESX host {{(pid=68906) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}}
[ 648.519885] env[68906]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-3e69d8d4-4c14-40f0-a5cd-4d3720f80cc9 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 648.549927] env[68906]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){
[ 648.549927] env[68906]: value = "task-3475282"
[ 648.549927] env[68906]: _type = "Task"
[ 648.549927] env[68906]: } to complete. {{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 648.559154] env[68906]: DEBUG oslo_vmware.api [-] Task: {'id': task-3475282, 'name': CreateVM_Task} progress is 0%. {{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 648.587859] env[68906]: DEBUG oslo_concurrency.lockutils [None req-579f131a-2dbc-470d-9058-ec3a343d421d tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Releasing lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 648.588442] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-579f131a-2dbc-470d-9058-ec3a343d421d tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: d6ca51b9-b284-405c-878e-fdbc326b73e1] Processing image b1400c31-d33b-4e13-944f-4c645e62493e {{(pid=68906) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}}
[ 648.588681] env[68906]: DEBUG oslo_concurrency.lockutils [None req-579f131a-2dbc-470d-9058-ec3a343d421d tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e/b1400c31-d33b-4e13-944f-4c645e62493e.vmdk" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 648.726277] env[68906]: DEBUG nova.network.neutron [None req-dac96d6f-6d7b-49e4-904b-bb16242f78bb tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] [instance: f42056e5-52cb-4d69-8022-ca643c49194e] Updating instance_info_cache with network_info: [{"id": "e312579f-5726-4546-9715-ecd7469d54fc", "address": "fa:16:3e:ef:c0:3f", "network": {"id": "a9fd09ac-36e9-4c8d-83bd-4e2704c839d6", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1118120170-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "3d3cc4c86bc14a69a001ef23df615f2c", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "40c947c4-f471-4d48-8e43-fee54198107e", "external-id": "nsx-vlan-transportzone-203", "segmentation_id": 203, "bound_drivers": {"0": "nsxv3"}}, "devname": "tape312579f-57", "ovs_interfaceid": "e312579f-5726-4546-9715-ecd7469d54fc", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68906) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 648.744062] env[68906]: DEBUG oslo_concurrency.lockutils [None req-dac96d6f-6d7b-49e4-904b-bb16242f78bb tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] Releasing lock "refresh_cache-f42056e5-52cb-4d69-8022-ca643c49194e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 648.744321] env[68906]: DEBUG nova.compute.manager [None req-dac96d6f-6d7b-49e4-904b-bb16242f78bb tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] [instance: f42056e5-52cb-4d69-8022-ca643c49194e] Instance network_info: |[{"id": "e312579f-5726-4546-9715-ecd7469d54fc", "address": "fa:16:3e:ef:c0:3f", "network": {"id": "a9fd09ac-36e9-4c8d-83bd-4e2704c839d6", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1118120170-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "3d3cc4c86bc14a69a001ef23df615f2c", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "40c947c4-f471-4d48-8e43-fee54198107e", "external-id": "nsx-vlan-transportzone-203", "segmentation_id": 203, "bound_drivers": {"0": "nsxv3"}}, "devname": "tape312579f-57", "ovs_interfaceid": "e312579f-5726-4546-9715-ecd7469d54fc", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=68906) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}}
[ 648.744707] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-dac96d6f-6d7b-49e4-904b-bb16242f78bb tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] [instance: f42056e5-52cb-4d69-8022-ca643c49194e] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:ef:c0:3f', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '40c947c4-f471-4d48-8e43-fee54198107e', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'e312579f-5726-4546-9715-ecd7469d54fc', 'vif_model': 'vmxnet3'}] {{(pid=68906) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}}
[ 648.753341] env[68906]: DEBUG nova.virt.vmwareapi.vm_util [None req-dac96d6f-6d7b-49e4-904b-bb16242f78bb tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] Creating folder: Project (3d3cc4c86bc14a69a001ef23df615f2c). Parent ref: group-v694750. {{(pid=68906) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}}
[ 648.754293] env[68906]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-5d7c375c-69fb-45c8-9522-60d60cbf8444 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 648.768042] env[68906]: INFO nova.virt.vmwareapi.vm_util [None req-dac96d6f-6d7b-49e4-904b-bb16242f78bb tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] Created folder: Project (3d3cc4c86bc14a69a001ef23df615f2c) in parent group-v694750.
[ 648.768455] env[68906]: DEBUG nova.virt.vmwareapi.vm_util [None req-dac96d6f-6d7b-49e4-904b-bb16242f78bb tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] Creating folder: Instances. Parent ref: group-v694781. {{(pid=68906) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}}
[ 648.768542] env[68906]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-b923ef0f-68d5-4b81-9619-972d1141dd8f {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 648.780841] env[68906]: INFO nova.virt.vmwareapi.vm_util [None req-dac96d6f-6d7b-49e4-904b-bb16242f78bb tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] Created folder: Instances in parent group-v694781.
[ 648.781071] env[68906]: DEBUG oslo.service.loopingcall [None req-dac96d6f-6d7b-49e4-904b-bb16242f78bb tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=68906) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}}
[ 648.781268] env[68906]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: f42056e5-52cb-4d69-8022-ca643c49194e] Creating VM on the ESX host {{(pid=68906) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}}
[ 648.781475] env[68906]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-38a3fbc1-35e7-496c-9b1f-869b0e330e84 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 648.801672] env[68906]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){
[ 648.801672] env[68906]: value = "task-3475285"
[ 648.801672] env[68906]: _type = "Task"
[ 648.801672] env[68906]: } to complete. {{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 648.809090] env[68906]: DEBUG oslo_vmware.api [-] Task: {'id': task-3475285, 'name': CreateVM_Task} progress is 0%. {{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 648.874120] env[68906]: DEBUG nova.compute.manager [req-a9e6353d-9be0-4f7f-9768-7e4fee89dbc5 req-1f34a94f-ad46-4a01-acde-e3c029cd6c83 service nova] [instance: d6ca51b9-b284-405c-878e-fdbc326b73e1] Received event network-vif-plugged-390f6aeb-46ca-4723-b36d-949492fa4618 {{(pid=68906) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}}
[ 648.874620] env[68906]: DEBUG oslo_concurrency.lockutils [req-a9e6353d-9be0-4f7f-9768-7e4fee89dbc5 req-1f34a94f-ad46-4a01-acde-e3c029cd6c83 service nova] Acquiring lock "d6ca51b9-b284-405c-878e-fdbc326b73e1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 648.874620] env[68906]: DEBUG oslo_concurrency.lockutils [req-a9e6353d-9be0-4f7f-9768-7e4fee89dbc5 req-1f34a94f-ad46-4a01-acde-e3c029cd6c83 service nova] Lock "d6ca51b9-b284-405c-878e-fdbc326b73e1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 648.874796] env[68906]: DEBUG oslo_concurrency.lockutils [req-a9e6353d-9be0-4f7f-9768-7e4fee89dbc5 req-1f34a94f-ad46-4a01-acde-e3c029cd6c83 service nova] Lock "d6ca51b9-b284-405c-878e-fdbc326b73e1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 648.874949] env[68906]: DEBUG nova.compute.manager [req-a9e6353d-9be0-4f7f-9768-7e4fee89dbc5 req-1f34a94f-ad46-4a01-acde-e3c029cd6c83 service nova] [instance: d6ca51b9-b284-405c-878e-fdbc326b73e1] No waiting events found dispatching network-vif-plugged-390f6aeb-46ca-4723-b36d-949492fa4618 {{(pid=68906) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}}
[ 648.875854] env[68906]: WARNING nova.compute.manager [req-a9e6353d-9be0-4f7f-9768-7e4fee89dbc5 req-1f34a94f-ad46-4a01-acde-e3c029cd6c83 service nova] [instance: d6ca51b9-b284-405c-878e-fdbc326b73e1] Received unexpected event network-vif-plugged-390f6aeb-46ca-4723-b36d-949492fa4618 for instance with vm_state building and task_state spawning.
[ 649.061776] env[68906]: DEBUG oslo_vmware.api [-] Task: {'id': task-3475282, 'name': CreateVM_Task} progress is 99%. {{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 649.316300] env[68906]: DEBUG oslo_vmware.api [-] Task: {'id': task-3475285, 'name': CreateVM_Task} progress is 25%. {{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 649.341707] env[68906]: DEBUG nova.compute.manager [req-c3a7fe75-bd00-4b01-aa40-b652887c04b4 req-746870bb-3ead-47bf-b96e-458a16343c9f service nova] [instance: 0540a4dc-1b86-4776-b633-f540af168a2b] Received event network-changed-f99c8fa6-99d5-43ed-b528-9d2b22675c2b {{(pid=68906) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}}
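[Editor's note: the network-vif-plugged entries above show Nova's external-event dispatch: spawning threads register the events they expect and wait, while Neutron-driven handlers pop and signal them; an event nobody registered produces the "Received unexpected event" warning. A simplified sketch of nova.compute.manager.InstanceEvents using plain threading:]

```python
import threading

class InstanceEvents:
    """Sketch of the per-instance event table behind the entries above."""

    def __init__(self):
        self._events = {}        # (instance_uuid, event_name) -> threading.Event
        self._lock = threading.Lock()   # plays the "<uuid>-events" lock role

    def prepare(self, instance_uuid, event_name):
        """Called by the spawning thread before it waits on the event."""
        ev = threading.Event()
        with self._lock:
            self._events[(instance_uuid, event_name)] = ev
        return ev

    def pop_and_signal(self, instance_uuid, event_name):
        """Called by the external-event handler when Neutron reports."""
        with self._lock:
            ev = self._events.pop((instance_uuid, event_name), None)
        if ev is None:
            # "No waiting events found dispatching ..." then the WARNING
            print(f"Received unexpected event {event_name} for instance "
                  f"{instance_uuid}")
            return False
        ev.set()
        return True
```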
[ 649.342267] env[68906]: DEBUG nova.compute.manager [req-c3a7fe75-bd00-4b01-aa40-b652887c04b4 req-746870bb-3ead-47bf-b96e-458a16343c9f service nova] [instance: 0540a4dc-1b86-4776-b633-f540af168a2b] Refreshing instance network info cache due to event network-changed-f99c8fa6-99d5-43ed-b528-9d2b22675c2b. {{(pid=68906) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}}
[ 649.342267] env[68906]: DEBUG oslo_concurrency.lockutils [req-c3a7fe75-bd00-4b01-aa40-b652887c04b4 req-746870bb-3ead-47bf-b96e-458a16343c9f service nova] Acquiring lock "refresh_cache-0540a4dc-1b86-4776-b633-f540af168a2b" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 649.342506] env[68906]: DEBUG oslo_concurrency.lockutils [req-c3a7fe75-bd00-4b01-aa40-b652887c04b4 req-746870bb-3ead-47bf-b96e-458a16343c9f service nova] Acquired lock "refresh_cache-0540a4dc-1b86-4776-b633-f540af168a2b" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 649.342651] env[68906]: DEBUG nova.network.neutron [req-c3a7fe75-bd00-4b01-aa40-b652887c04b4 req-746870bb-3ead-47bf-b96e-458a16343c9f service nova] [instance: 0540a4dc-1b86-4776-b633-f540af168a2b] Refreshing network info cache for port f99c8fa6-99d5-43ed-b528-9d2b22675c2b {{(pid=68906) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}}
[ 649.567592] env[68906]: DEBUG oslo_vmware.api [-] Task: {'id': task-3475282, 'name': CreateVM_Task, 'duration_secs': 0.539338} completed successfully. {{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}}
[ 649.568123] env[68906]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: ce63789a-1f0f-40ca-8368-ac3f84bb58cd] Created VM on the ESX host {{(pid=68906) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}}
[ 649.569464] env[68906]: DEBUG oslo_concurrency.lockutils [None req-25b03b1c-8272-403d-9ff2-a1dba7656508 tempest-VolumesAdminNegativeTest-1259132687 tempest-VolumesAdminNegativeTest-1259132687-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 649.569690] env[68906]: DEBUG oslo_concurrency.lockutils [None req-25b03b1c-8272-403d-9ff2-a1dba7656508 tempest-VolumesAdminNegativeTest-1259132687 tempest-VolumesAdminNegativeTest-1259132687-project-member] Acquired lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 649.570177] env[68906]: DEBUG oslo_concurrency.lockutils [None req-25b03b1c-8272-403d-9ff2-a1dba7656508 tempest-VolumesAdminNegativeTest-1259132687 tempest-VolumesAdminNegativeTest-1259132687-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}}
[ 649.570501] env[68906]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-39453179-5c42-442d-bd73-a6fc8fd1f48b {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 649.577282] env[68906]: DEBUG oslo_vmware.api [None req-25b03b1c-8272-403d-9ff2-a1dba7656508 tempest-VolumesAdminNegativeTest-1259132687 tempest-VolumesAdminNegativeTest-1259132687-project-member] Waiting for the task: (returnval){
[ 649.577282] env[68906]: value = "session[52a3cb0d-1212-490c-2669-91043b4da4d8]52bb6e64-01c2-02fb-a9ff-776140093016"
[ 649.577282] env[68906]: _type = "Task"
[ 649.577282] env[68906]: } to complete. {{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 649.588725] env[68906]: DEBUG oslo_vmware.api [None req-25b03b1c-8272-403d-9ff2-a1dba7656508 tempest-VolumesAdminNegativeTest-1259132687 tempest-VolumesAdminNegativeTest-1259132687-project-member] Task: {'id': session[52a3cb0d-1212-490c-2669-91043b4da4d8]52bb6e64-01c2-02fb-a9ff-776140093016, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 649.712138] env[68906]: DEBUG nova.network.neutron [req-c3a7fe75-bd00-4b01-aa40-b652887c04b4 req-746870bb-3ead-47bf-b96e-458a16343c9f service nova] [instance: 0540a4dc-1b86-4776-b633-f540af168a2b] Updated VIF entry in instance network info cache for port f99c8fa6-99d5-43ed-b528-9d2b22675c2b. {{(pid=68906) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}}
[ 649.712481] env[68906]: DEBUG nova.network.neutron [req-c3a7fe75-bd00-4b01-aa40-b652887c04b4 req-746870bb-3ead-47bf-b96e-458a16343c9f service nova] [instance: 0540a4dc-1b86-4776-b633-f540af168a2b] Updating instance_info_cache with network_info: [{"id": "f99c8fa6-99d5-43ed-b528-9d2b22675c2b", "address": "fa:16:3e:48:85:3d", "network": {"id": "0b2fac70-58a4-4e6b-88d2-37fe26c225da", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-11552093-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "5c2305d3a9b443f29494e5e234e0f492", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "1a8c8175-1197-4f12-baac-ef6aba95f585", "external-id": "nsx-vlan-transportzone-832", "segmentation_id": 832, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapf99c8fa6-99", "ovs_interfaceid": "f99c8fa6-99d5-43ed-b528-9d2b22675c2b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68906) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 649.724839] env[68906]: DEBUG oslo_concurrency.lockutils [req-c3a7fe75-bd00-4b01-aa40-b652887c04b4 req-746870bb-3ead-47bf-b96e-458a16343c9f service nova] Releasing lock "refresh_cache-0540a4dc-1b86-4776-b633-f540af168a2b" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 649.725121] env[68906]: DEBUG nova.compute.manager [req-c3a7fe75-bd00-4b01-aa40-b652887c04b4 req-746870bb-3ead-47bf-b96e-458a16343c9f service nova] [instance: 4edb8b9f-b608-4be8-bfd3-65642710f9bd] Received event network-vif-plugged-57b48c3e-57a8-4ee0-a974-0813b3871e35 {{(pid=68906) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}}
[ 649.725316] env[68906]: DEBUG oslo_concurrency.lockutils [req-c3a7fe75-bd00-4b01-aa40-b652887c04b4 req-746870bb-3ead-47bf-b96e-458a16343c9f service nova] Acquiring lock "4edb8b9f-b608-4be8-bfd3-65642710f9bd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 649.725519] env[68906]: DEBUG oslo_concurrency.lockutils [req-c3a7fe75-bd00-4b01-aa40-b652887c04b4 req-746870bb-3ead-47bf-b96e-458a16343c9f service nova] Lock "4edb8b9f-b608-4be8-bfd3-65642710f9bd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 649.725676] env[68906]: DEBUG oslo_concurrency.lockutils [req-c3a7fe75-bd00-4b01-aa40-b652887c04b4 req-746870bb-3ead-47bf-b96e-458a16343c9f service nova] Lock "4edb8b9f-b608-4be8-bfd3-65642710f9bd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 649.725835] env[68906]: DEBUG nova.compute.manager [req-c3a7fe75-bd00-4b01-aa40-b652887c04b4 req-746870bb-3ead-47bf-b96e-458a16343c9f service nova] [instance: 4edb8b9f-b608-4be8-bfd3-65642710f9bd] No waiting events found dispatching network-vif-plugged-57b48c3e-57a8-4ee0-a974-0813b3871e35 {{(pid=68906) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}}
[ 649.726000] env[68906]: WARNING nova.compute.manager [req-c3a7fe75-bd00-4b01-aa40-b652887c04b4 req-746870bb-3ead-47bf-b96e-458a16343c9f service nova] [instance: 4edb8b9f-b608-4be8-bfd3-65642710f9bd] Received unexpected event network-vif-plugged-57b48c3e-57a8-4ee0-a974-0813b3871e35 for instance with vm_state building and task_state spawning.
[ 649.726212] env[68906]: DEBUG nova.compute.manager [req-c3a7fe75-bd00-4b01-aa40-b652887c04b4 req-746870bb-3ead-47bf-b96e-458a16343c9f service nova] [instance: 4edb8b9f-b608-4be8-bfd3-65642710f9bd] Received event network-changed-57b48c3e-57a8-4ee0-a974-0813b3871e35 {{(pid=68906) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}}
[ 649.726376] env[68906]: DEBUG nova.compute.manager [req-c3a7fe75-bd00-4b01-aa40-b652887c04b4 req-746870bb-3ead-47bf-b96e-458a16343c9f service nova] [instance: 4edb8b9f-b608-4be8-bfd3-65642710f9bd] Refreshing instance network info cache due to event network-changed-57b48c3e-57a8-4ee0-a974-0813b3871e35. {{(pid=68906) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}}
[ 649.726557] env[68906]: DEBUG oslo_concurrency.lockutils [req-c3a7fe75-bd00-4b01-aa40-b652887c04b4 req-746870bb-3ead-47bf-b96e-458a16343c9f service nova] Acquiring lock "refresh_cache-4edb8b9f-b608-4be8-bfd3-65642710f9bd" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 649.726826] env[68906]: DEBUG oslo_concurrency.lockutils [req-c3a7fe75-bd00-4b01-aa40-b652887c04b4 req-746870bb-3ead-47bf-b96e-458a16343c9f service nova] Acquired lock "refresh_cache-4edb8b9f-b608-4be8-bfd3-65642710f9bd" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 649.726902] env[68906]: DEBUG nova.network.neutron [req-c3a7fe75-bd00-4b01-aa40-b652887c04b4 req-746870bb-3ead-47bf-b96e-458a16343c9f service nova] [instance: 4edb8b9f-b608-4be8-bfd3-65642710f9bd] Refreshing network info cache for port 57b48c3e-57a8-4ee0-a974-0813b3871e35 {{(pid=68906) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}}
[ 649.814642] env[68906]: DEBUG oslo_vmware.api [-] Task: {'id': task-3475285, 'name': CreateVM_Task, 'duration_secs': 0.622685} completed successfully. {{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}}
[ 649.814642] env[68906]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: f42056e5-52cb-4d69-8022-ca643c49194e] Created VM on the ESX host {{(pid=68906) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}}
[ 649.814642] env[68906]: DEBUG oslo_concurrency.lockutils [None req-dac96d6f-6d7b-49e4-904b-bb16242f78bb tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 650.024435] env[68906]: DEBUG nova.network.neutron [req-c3a7fe75-bd00-4b01-aa40-b652887c04b4 req-746870bb-3ead-47bf-b96e-458a16343c9f service nova] [instance: 4edb8b9f-b608-4be8-bfd3-65642710f9bd] Updated VIF entry in instance network info cache for port 57b48c3e-57a8-4ee0-a974-0813b3871e35. {{(pid=68906) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}}
[ 650.024435] env[68906]: DEBUG nova.network.neutron [req-c3a7fe75-bd00-4b01-aa40-b652887c04b4 req-746870bb-3ead-47bf-b96e-458a16343c9f service nova] [instance: 4edb8b9f-b608-4be8-bfd3-65642710f9bd] Updating instance_info_cache with network_info: [{"id": "57b48c3e-57a8-4ee0-a974-0813b3871e35", "address": "fa:16:3e:97:e8:13", "network": {"id": "c9025f67-c9f7-4312-b2bd-5fbb06647b07", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-9371784-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "0e206dedfb584e219a7f5dd633032515", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "f16a5584-aed0-4df4-820b-5e7f15977265", "external-id": "cl2-zone-495", "segmentation_id": 495, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap57b48c3e-57", "ovs_interfaceid": "57b48c3e-57a8-4ee0-a974-0813b3871e35", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68906) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 650.036557] env[68906]: DEBUG oslo_concurrency.lockutils [req-c3a7fe75-bd00-4b01-aa40-b652887c04b4 req-746870bb-3ead-47bf-b96e-458a16343c9f service nova] Releasing lock "refresh_cache-4edb8b9f-b608-4be8-bfd3-65642710f9bd" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 650.090490] env[68906]: DEBUG oslo_concurrency.lockutils [None req-25b03b1c-8272-403d-9ff2-a1dba7656508 tempest-VolumesAdminNegativeTest-1259132687 tempest-VolumesAdminNegativeTest-1259132687-project-member] Releasing lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 650.090826] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-25b03b1c-8272-403d-9ff2-a1dba7656508 tempest-VolumesAdminNegativeTest-1259132687 tempest-VolumesAdminNegativeTest-1259132687-project-member] [instance: ce63789a-1f0f-40ca-8368-ac3f84bb58cd] Processing image b1400c31-d33b-4e13-944f-4c645e62493e {{(pid=68906) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}}
[ 650.091108] env[68906]: DEBUG oslo_concurrency.lockutils [None req-25b03b1c-8272-403d-9ff2-a1dba7656508 tempest-VolumesAdminNegativeTest-1259132687 tempest-VolumesAdminNegativeTest-1259132687-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e/b1400c31-d33b-4e13-944f-4c645e62493e.vmdk" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 650.091346] env[68906]: DEBUG oslo_concurrency.lockutils [None req-dac96d6f-6d7b-49e4-904b-bb16242f78bb tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] Acquired lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 650.091651] env[68906]: DEBUG oslo_concurrency.lockutils [None req-dac96d6f-6d7b-49e4-904b-bb16242f78bb tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}}
[ 650.091979] env[68906]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-27c8b3cb-a289-4df8-842e-37bb860b580d {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 650.098338] env[68906]: DEBUG oslo_vmware.api [None req-dac96d6f-6d7b-49e4-904b-bb16242f78bb tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] Waiting for the task: (returnval){
[ 650.098338] env[68906]: value = "session[52a3cb0d-1212-490c-2669-91043b4da4d8]52d9a322-97ad-35c1-561f-f3a2d4d338eb"
[ 650.098338] env[68906]: _type = "Task"
[ 650.098338] env[68906]: } to complete. {{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 650.108929] env[68906]: DEBUG oslo_vmware.api [None req-dac96d6f-6d7b-49e4-904b-bb16242f78bb tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] Task: {'id': session[52a3cb0d-1212-490c-2669-91043b4da4d8]52d9a322-97ad-35c1-561f-f3a2d4d338eb, 'name': SearchDatastore_Task} progress is 0%.
{{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 650.607932] env[68906]: DEBUG oslo_concurrency.lockutils [None req-dac96d6f-6d7b-49e4-904b-bb16242f78bb tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] Releasing lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 650.608272] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-dac96d6f-6d7b-49e4-904b-bb16242f78bb tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] [instance: f42056e5-52cb-4d69-8022-ca643c49194e] Processing image b1400c31-d33b-4e13-944f-4c645e62493e {{(pid=68906) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 650.608405] env[68906]: DEBUG oslo_concurrency.lockutils [None req-dac96d6f-6d7b-49e4-904b-bb16242f78bb tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e/b1400c31-d33b-4e13-944f-4c645e62493e.vmdk" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 652.344110] env[68906]: DEBUG oslo_concurrency.lockutils [None req-6d6efef2-b262-4393-93df-3cbea2feabcc tempest-TenantUsagesTestJSON-2001711929 tempest-TenantUsagesTestJSON-2001711929-project-member] Acquiring lock "a7e0a28f-42a5-442e-b962-07771d2e6a27" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 652.344473] env[68906]: DEBUG oslo_concurrency.lockutils [None req-6d6efef2-b262-4393-93df-3cbea2feabcc tempest-TenantUsagesTestJSON-2001711929 tempest-TenantUsagesTestJSON-2001711929-project-member] Lock "a7e0a28f-42a5-442e-b962-07771d2e6a27" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 652.484261] env[68906]: DEBUG nova.compute.manager [req-84174de7-07cd-4ad0-b96e-0851d164d844 req-bf377c89-72ba-4d89-82c9-41c8afe97d0e service nova] [instance: d6ca51b9-b284-405c-878e-fdbc326b73e1] Received event network-changed-390f6aeb-46ca-4723-b36d-949492fa4618 {{(pid=68906) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 652.484562] env[68906]: DEBUG nova.compute.manager [req-84174de7-07cd-4ad0-b96e-0851d164d844 req-bf377c89-72ba-4d89-82c9-41c8afe97d0e service nova] [instance: d6ca51b9-b284-405c-878e-fdbc326b73e1] Refreshing instance network info cache due to event network-changed-390f6aeb-46ca-4723-b36d-949492fa4618.
{{(pid=68906) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 652.484789] env[68906]: DEBUG oslo_concurrency.lockutils [req-84174de7-07cd-4ad0-b96e-0851d164d844 req-bf377c89-72ba-4d89-82c9-41c8afe97d0e service nova] Acquiring lock "refresh_cache-d6ca51b9-b284-405c-878e-fdbc326b73e1" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 652.485508] env[68906]: DEBUG oslo_concurrency.lockutils [req-84174de7-07cd-4ad0-b96e-0851d164d844 req-bf377c89-72ba-4d89-82c9-41c8afe97d0e service nova] Acquired lock "refresh_cache-d6ca51b9-b284-405c-878e-fdbc326b73e1" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 652.485508] env[68906]: DEBUG nova.network.neutron [req-84174de7-07cd-4ad0-b96e-0851d164d844 req-bf377c89-72ba-4d89-82c9-41c8afe97d0e service nova] [instance: d6ca51b9-b284-405c-878e-fdbc326b73e1] Refreshing network info cache for port 390f6aeb-46ca-4723-b36d-949492fa4618 {{(pid=68906) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 653.206605] env[68906]: DEBUG nova.network.neutron [req-84174de7-07cd-4ad0-b96e-0851d164d844 req-bf377c89-72ba-4d89-82c9-41c8afe97d0e service nova] [instance: d6ca51b9-b284-405c-878e-fdbc326b73e1] Updated VIF entry in instance network info cache for port 390f6aeb-46ca-4723-b36d-949492fa4618. {{(pid=68906) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 653.206954] env[68906]: DEBUG nova.network.neutron [req-84174de7-07cd-4ad0-b96e-0851d164d844 req-bf377c89-72ba-4d89-82c9-41c8afe97d0e service nova] [instance: d6ca51b9-b284-405c-878e-fdbc326b73e1] Updating instance_info_cache with network_info: [{"id": "390f6aeb-46ca-4723-b36d-949492fa4618", "address": "fa:16:3e:23:42:77", "network": {"id": "da6ba094-8e2a-4f76-813c-8668f482685b", "bridge": "br-int", "label": "tempest-ServersTestJSON-512380607-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "90f212f7916446919081fcdc0527ebb0", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "0cd5d325-3053-407e-a4ee-f627e82a23f9", "external-id": "nsx-vlan-transportzone-809", "segmentation_id": 809, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap390f6aeb-46", "ovs_interfaceid": "390f6aeb-46ca-4723-b36d-949492fa4618", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68906) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 653.220662] env[68906]: DEBUG oslo_concurrency.lockutils [req-84174de7-07cd-4ad0-b96e-0851d164d844 req-bf377c89-72ba-4d89-82c9-41c8afe97d0e service nova] Releasing lock "refresh_cache-d6ca51b9-b284-405c-878e-fdbc326b73e1" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 653.220923] env[68906]: DEBUG nova.compute.manager [req-84174de7-07cd-4ad0-b96e-0851d164d844 req-bf377c89-72ba-4d89-82c9-41c8afe97d0e service nova] [instance: f42056e5-52cb-4d69-8022-ca643c49194e] Received event 
network-vif-plugged-e312579f-5726-4546-9715-ecd7469d54fc {{(pid=68906) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 653.221258] env[68906]: DEBUG oslo_concurrency.lockutils [req-84174de7-07cd-4ad0-b96e-0851d164d844 req-bf377c89-72ba-4d89-82c9-41c8afe97d0e service nova] Acquiring lock "f42056e5-52cb-4d69-8022-ca643c49194e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 653.221324] env[68906]: DEBUG oslo_concurrency.lockutils [req-84174de7-07cd-4ad0-b96e-0851d164d844 req-bf377c89-72ba-4d89-82c9-41c8afe97d0e service nova] Lock "f42056e5-52cb-4d69-8022-ca643c49194e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 653.221486] env[68906]: DEBUG oslo_concurrency.lockutils [req-84174de7-07cd-4ad0-b96e-0851d164d844 req-bf377c89-72ba-4d89-82c9-41c8afe97d0e service nova] Lock "f42056e5-52cb-4d69-8022-ca643c49194e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 653.221645] env[68906]: DEBUG nova.compute.manager [req-84174de7-07cd-4ad0-b96e-0851d164d844 req-bf377c89-72ba-4d89-82c9-41c8afe97d0e service nova] [instance: f42056e5-52cb-4d69-8022-ca643c49194e] No waiting events found dispatching network-vif-plugged-e312579f-5726-4546-9715-ecd7469d54fc {{(pid=68906) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 653.221805] env[68906]: WARNING nova.compute.manager [req-84174de7-07cd-4ad0-b96e-0851d164d844 req-bf377c89-72ba-4d89-82c9-41c8afe97d0e service nova] [instance: f42056e5-52cb-4d69-8022-ca643c49194e] Received unexpected event network-vif-plugged-e312579f-5726-4546-9715-ecd7469d54fc for instance with vm_state building and task_state spawning. [ 653.221973] env[68906]: DEBUG nova.compute.manager [req-84174de7-07cd-4ad0-b96e-0851d164d844 req-bf377c89-72ba-4d89-82c9-41c8afe97d0e service nova] [instance: f42056e5-52cb-4d69-8022-ca643c49194e] Received event network-changed-e312579f-5726-4546-9715-ecd7469d54fc {{(pid=68906) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 653.222128] env[68906]: DEBUG nova.compute.manager [req-84174de7-07cd-4ad0-b96e-0851d164d844 req-bf377c89-72ba-4d89-82c9-41c8afe97d0e service nova] [instance: f42056e5-52cb-4d69-8022-ca643c49194e] Refreshing instance network info cache due to event network-changed-e312579f-5726-4546-9715-ecd7469d54fc.
{{(pid=68906) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 653.222308] env[68906]: DEBUG oslo_concurrency.lockutils [req-84174de7-07cd-4ad0-b96e-0851d164d844 req-bf377c89-72ba-4d89-82c9-41c8afe97d0e service nova] Acquiring lock "refresh_cache-f42056e5-52cb-4d69-8022-ca643c49194e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 653.223742] env[68906]: DEBUG oslo_concurrency.lockutils [req-84174de7-07cd-4ad0-b96e-0851d164d844 req-bf377c89-72ba-4d89-82c9-41c8afe97d0e service nova] Acquired lock "refresh_cache-f42056e5-52cb-4d69-8022-ca643c49194e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 653.223742] env[68906]: DEBUG nova.network.neutron [req-84174de7-07cd-4ad0-b96e-0851d164d844 req-bf377c89-72ba-4d89-82c9-41c8afe97d0e service nova] [instance: f42056e5-52cb-4d69-8022-ca643c49194e] Refreshing network info cache for port e312579f-5726-4546-9715-ecd7469d54fc {{(pid=68906) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 653.227678] env[68906]: DEBUG nova.compute.manager [req-45a82b2d-5f91-4568-b219-460c992c4ca0 req-d7246b39-12b5-4274-8f73-0859e631f619 service nova] [instance: ce63789a-1f0f-40ca-8368-ac3f84bb58cd] Received event network-vif-plugged-81733f57-0e3f-4d59-8413-61c98911ea1f {{(pid=68906) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 653.227823] env[68906]: DEBUG oslo_concurrency.lockutils [req-45a82b2d-5f91-4568-b219-460c992c4ca0 req-d7246b39-12b5-4274-8f73-0859e631f619 service nova] Acquiring lock "ce63789a-1f0f-40ca-8368-ac3f84bb58cd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 653.227981] env[68906]: DEBUG oslo_concurrency.lockutils [req-45a82b2d-5f91-4568-b219-460c992c4ca0 req-d7246b39-12b5-4274-8f73-0859e631f619 service nova] Lock "ce63789a-1f0f-40ca-8368-ac3f84bb58cd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 653.228186] env[68906]: DEBUG oslo_concurrency.lockutils [req-45a82b2d-5f91-4568-b219-460c992c4ca0 req-d7246b39-12b5-4274-8f73-0859e631f619 service nova] Lock "ce63789a-1f0f-40ca-8368-ac3f84bb58cd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 653.228513] env[68906]: DEBUG nova.compute.manager [req-45a82b2d-5f91-4568-b219-460c992c4ca0 req-d7246b39-12b5-4274-8f73-0859e631f619 service nova] [instance: ce63789a-1f0f-40ca-8368-ac3f84bb58cd] No waiting events found dispatching network-vif-plugged-81733f57-0e3f-4d59-8413-61c98911ea1f {{(pid=68906) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 653.228513] env[68906]: WARNING nova.compute.manager [req-45a82b2d-5f91-4568-b219-460c992c4ca0 req-d7246b39-12b5-4274-8f73-0859e631f619 service nova] [instance: ce63789a-1f0f-40ca-8368-ac3f84bb58cd] Received unexpected event network-vif-plugged-81733f57-0e3f-4d59-8413-61c98911ea1f for instance with vm_state building and task_state spawning.
[ 653.228921] env[68906]: DEBUG nova.compute.manager [req-45a82b2d-5f91-4568-b219-460c992c4ca0 req-d7246b39-12b5-4274-8f73-0859e631f619 service nova] [instance: ce63789a-1f0f-40ca-8368-ac3f84bb58cd] Received event network-changed-81733f57-0e3f-4d59-8413-61c98911ea1f {{(pid=68906) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 653.228921] env[68906]: DEBUG nova.compute.manager [req-45a82b2d-5f91-4568-b219-460c992c4ca0 req-d7246b39-12b5-4274-8f73-0859e631f619 service nova] [instance: ce63789a-1f0f-40ca-8368-ac3f84bb58cd] Refreshing instance network info cache due to event network-changed-81733f57-0e3f-4d59-8413-61c98911ea1f. {{(pid=68906) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 653.229073] env[68906]: DEBUG oslo_concurrency.lockutils [req-45a82b2d-5f91-4568-b219-460c992c4ca0 req-d7246b39-12b5-4274-8f73-0859e631f619 service nova] Acquiring lock "refresh_cache-ce63789a-1f0f-40ca-8368-ac3f84bb58cd" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 653.229157] env[68906]: DEBUG oslo_concurrency.lockutils [req-45a82b2d-5f91-4568-b219-460c992c4ca0 req-d7246b39-12b5-4274-8f73-0859e631f619 service nova] Acquired lock "refresh_cache-ce63789a-1f0f-40ca-8368-ac3f84bb58cd" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 653.229316] env[68906]: DEBUG nova.network.neutron [req-45a82b2d-5f91-4568-b219-460c992c4ca0 req-d7246b39-12b5-4274-8f73-0859e631f619 service nova] [instance: ce63789a-1f0f-40ca-8368-ac3f84bb58cd] Refreshing network info cache for port 81733f57-0e3f-4d59-8413-61c98911ea1f {{(pid=68906) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 653.745136] env[68906]: DEBUG nova.network.neutron [req-84174de7-07cd-4ad0-b96e-0851d164d844 req-bf377c89-72ba-4d89-82c9-41c8afe97d0e service nova] [instance: f42056e5-52cb-4d69-8022-ca643c49194e] Updated VIF entry in instance network info cache for port e312579f-5726-4546-9715-ecd7469d54fc. 
{{(pid=68906) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 653.749205] env[68906]: DEBUG nova.network.neutron [req-84174de7-07cd-4ad0-b96e-0851d164d844 req-bf377c89-72ba-4d89-82c9-41c8afe97d0e service nova] [instance: f42056e5-52cb-4d69-8022-ca643c49194e] Updating instance_info_cache with network_info: [{"id": "e312579f-5726-4546-9715-ecd7469d54fc", "address": "fa:16:3e:ef:c0:3f", "network": {"id": "a9fd09ac-36e9-4c8d-83bd-4e2704c839d6", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1118120170-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "3d3cc4c86bc14a69a001ef23df615f2c", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "40c947c4-f471-4d48-8e43-fee54198107e", "external-id": "nsx-vlan-transportzone-203", "segmentation_id": 203, "bound_drivers": {"0": "nsxv3"}}, "devname": "tape312579f-57", "ovs_interfaceid": "e312579f-5726-4546-9715-ecd7469d54fc", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68906) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 653.760606] env[68906]: DEBUG oslo_concurrency.lockutils [req-84174de7-07cd-4ad0-b96e-0851d164d844 req-bf377c89-72ba-4d89-82c9-41c8afe97d0e service nova] Releasing lock "refresh_cache-f42056e5-52cb-4d69-8022-ca643c49194e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 654.434508] env[68906]: DEBUG nova.network.neutron [req-45a82b2d-5f91-4568-b219-460c992c4ca0 req-d7246b39-12b5-4274-8f73-0859e631f619 service nova] [instance: ce63789a-1f0f-40ca-8368-ac3f84bb58cd] Updated VIF entry in instance network info cache for port 81733f57-0e3f-4d59-8413-61c98911ea1f. 
{{(pid=68906) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 654.434508] env[68906]: DEBUG nova.network.neutron [req-45a82b2d-5f91-4568-b219-460c992c4ca0 req-d7246b39-12b5-4274-8f73-0859e631f619 service nova] [instance: ce63789a-1f0f-40ca-8368-ac3f84bb58cd] Updating instance_info_cache with network_info: [{"id": "81733f57-0e3f-4d59-8413-61c98911ea1f", "address": "fa:16:3e:8f:e8:0d", "network": {"id": "b3851e79-234a-4664-ac56-9020c209bcd1", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-1765566724-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "38f7116c64254c4ca65c358856b9b0e5", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "d6e940e5-e083-4238-973e-f1b4e2a3a5c7", "external-id": "nsx-vlan-transportzone-64", "segmentation_id": 64, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap81733f57-0e", "ovs_interfaceid": "81733f57-0e3f-4d59-8413-61c98911ea1f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68906) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 654.456404] env[68906]: DEBUG oslo_concurrency.lockutils [req-45a82b2d-5f91-4568-b219-460c992c4ca0 req-d7246b39-12b5-4274-8f73-0859e631f619 service nova] Releasing lock "refresh_cache-ce63789a-1f0f-40ca-8368-ac3f84bb58cd" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 655.157060] env[68906]: DEBUG oslo_concurrency.lockutils [None req-bce01e0c-e35b-4a41-8089-b5ea423b267e tempest-ServerDiagnosticsV248Test-570687926 tempest-ServerDiagnosticsV248Test-570687926-project-member] Acquiring lock "eb81e9b1-b573-4d7c-9ede-f8b32a43a201" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 655.157332] env[68906]: DEBUG oslo_concurrency.lockutils [None req-bce01e0c-e35b-4a41-8089-b5ea423b267e tempest-ServerDiagnosticsV248Test-570687926 tempest-ServerDiagnosticsV248Test-570687926-project-member] Lock "eb81e9b1-b573-4d7c-9ede-f8b32a43a201" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 659.082933] env[68906]: DEBUG oslo_concurrency.lockutils [None req-33e1c89b-3692-4ec9-abc0-4ab4dc3be0f7 tempest-ServersWithSpecificFlavorTestJSON-1201791476 tempest-ServersWithSpecificFlavorTestJSON-1201791476-project-member] Acquiring lock "12724be5-cfb1-4cf6-b98b-b4142da21714" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 659.083588] env[68906]: DEBUG oslo_concurrency.lockutils [None req-33e1c89b-3692-4ec9-abc0-4ab4dc3be0f7 tempest-ServersWithSpecificFlavorTestJSON-1201791476
tempest-ServersWithSpecificFlavorTestJSON-1201791476-project-member] Lock "12724be5-cfb1-4cf6-b98b-b4142da21714" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 663.550025] env[68906]: DEBUG oslo_concurrency.lockutils [None req-5cbdaf50-2b7b-401c-a522-7be104ee2090 tempest-ServersTestMultiNic-1243959320 tempest-ServersTestMultiNic-1243959320-project-member] Acquiring lock "b3bd0ecb-f329-48f3-b48b-25751262a5eb" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 663.550025] env[68906]: DEBUG oslo_concurrency.lockutils [None req-5cbdaf50-2b7b-401c-a522-7be104ee2090 tempest-ServersTestMultiNic-1243959320 tempest-ServersTestMultiNic-1243959320-project-member] Lock "b3bd0ecb-f329-48f3-b48b-25751262a5eb" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 664.260453] env[68906]: DEBUG oslo_concurrency.lockutils [None req-76e436a4-02de-4d9d-8c6d-05a0643fcf58 tempest-ServersTestFqdnHostnames-1135812067 tempest-ServersTestFqdnHostnames-1135812067-project-member] Acquiring lock "e63fba5c-46fd-494c-9aec-dd76f12974d7" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 664.260691] env[68906]: DEBUG oslo_concurrency.lockutils [None req-76e436a4-02de-4d9d-8c6d-05a0643fcf58 tempest-ServersTestFqdnHostnames-1135812067 tempest-ServersTestFqdnHostnames-1135812067-project-member] Lock "e63fba5c-46fd-494c-9aec-dd76f12974d7" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 664.261326] env[68906]: DEBUG oslo_concurrency.lockutils [None req-788bd7dd-0cd5-4bb7-8cd1-04d3a7d25d53 tempest-ListServerFiltersTestJSON-1324676598 tempest-ListServerFiltersTestJSON-1324676598-project-member] Acquiring lock "c02f41e2-8a99-4f18-9d86-82fa702bb2b6" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 664.261579] env[68906]: DEBUG oslo_concurrency.lockutils [None req-788bd7dd-0cd5-4bb7-8cd1-04d3a7d25d53 tempest-ListServerFiltersTestJSON-1324676598 tempest-ListServerFiltersTestJSON-1324676598-project-member] Lock "c02f41e2-8a99-4f18-9d86-82fa702bb2b6" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 665.536290] env[68906]: DEBUG oslo_concurrency.lockutils [None req-d4746353-557b-4464-928b-810736c8a5e8 tempest-ServersNegativeTestJSON-345353645 tempest-ServersNegativeTestJSON-345353645-project-member] Acquiring lock "a2414623-7871-4706-81db-7d15ca74fdab" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance"
{{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 665.536562] env[68906]: DEBUG oslo_concurrency.lockutils [None req-d4746353-557b-4464-928b-810736c8a5e8 tempest-ServersNegativeTestJSON-345353645 tempest-ServersNegativeTestJSON-345353645-project-member] Lock "a2414623-7871-4706-81db-7d15ca74fdab" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 665.619277] env[68906]: DEBUG oslo_concurrency.lockutils [None req-cfcc6ec4-506b-49d6-b35d-2b745b5478c6 tempest-ServerShowV247Test-1864688297 tempest-ServerShowV247Test-1864688297-project-member] Acquiring lock "252028a3-3d3e-44c5-9c51-26752962a90d" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 665.619567] env[68906]: DEBUG oslo_concurrency.lockutils [None req-cfcc6ec4-506b-49d6-b35d-2b745b5478c6 tempest-ServerShowV247Test-1864688297 tempest-ServerShowV247Test-1864688297-project-member] Lock "252028a3-3d3e-44c5-9c51-26752962a90d" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 666.345931] env[68906]: DEBUG oslo_concurrency.lockutils [None req-6ad0b13c-1d88-4d71-b729-8d7af234ed8c tempest-ListServerFiltersTestJSON-1324676598 tempest-ListServerFiltersTestJSON-1324676598-project-member] Acquiring lock "5ebd4d05-ddb3-4001-a526-a0c96b081818" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 666.346596] env[68906]: DEBUG oslo_concurrency.lockutils [None req-6ad0b13c-1d88-4d71-b729-8d7af234ed8c tempest-ListServerFiltersTestJSON-1324676598 tempest-ListServerFiltersTestJSON-1324676598-project-member] Lock "5ebd4d05-ddb3-4001-a526-a0c96b081818" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 666.708739] env[68906]: DEBUG oslo_concurrency.lockutils [None req-6779e842-3485-4ec7-8ae6-ff66c04d0527 tempest-ServerDiagnosticsTest-1865925156 tempest-ServerDiagnosticsTest-1865925156-project-member] Acquiring lock "c2e2265b-aef3-4a8c-ae03-314e679af64b" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 666.709101] env[68906]: DEBUG oslo_concurrency.lockutils [None req-6779e842-3485-4ec7-8ae6-ff66c04d0527 tempest-ServerDiagnosticsTest-1865925156 tempest-ServerDiagnosticsTest-1865925156-project-member] Lock "c2e2265b-aef3-4a8c-ae03-314e679af64b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 668.200992] env[68906]: DEBUG oslo_concurrency.lockutils [None req-bcd4eaa1-350f-4523-a418-b08be646decd
tempest-ListServerFiltersTestJSON-1324676598 tempest-ListServerFiltersTestJSON-1324676598-project-member] Acquiring lock "3ce9d4bd-3d7a-4191-9d3e-892efc81e8a2" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 668.201646] env[68906]: DEBUG oslo_concurrency.lockutils [None req-bcd4eaa1-350f-4523-a418-b08be646decd tempest-ListServerFiltersTestJSON-1324676598 tempest-ListServerFiltersTestJSON-1324676598-project-member] Lock "3ce9d4bd-3d7a-4191-9d3e-892efc81e8a2" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 669.036123] env[68906]: DEBUG oslo_concurrency.lockutils [None req-3732deea-6ab7-4cd8-9207-35e7d046cba4 tempest-ServerShowV247Test-1864688297 tempest-ServerShowV247Test-1864688297-project-member] Acquiring lock "3f26342e-89a8-4218-8875-8411eb8b16a0" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 669.036787] env[68906]: DEBUG oslo_concurrency.lockutils [None req-3732deea-6ab7-4cd8-9207-35e7d046cba4 tempest-ServerShowV247Test-1864688297 tempest-ServerShowV247Test-1864688297-project-member] Lock "3f26342e-89a8-4218-8875-8411eb8b16a0" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 670.586188] env[68906]: DEBUG oslo_concurrency.lockutils [None req-2ea042cd-dbdb-4f1c-ad8b-7f6455e1ae47 tempest-ServersTestBootFromVolume-70355936 tempest-ServersTestBootFromVolume-70355936-project-member] Acquiring lock "1fb9796e-e0d4-410d-bff1-a6a44b2a3580" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 670.586616] env[68906]: DEBUG oslo_concurrency.lockutils [None req-2ea042cd-dbdb-4f1c-ad8b-7f6455e1ae47 tempest-ServersTestBootFromVolume-70355936 tempest-ServersTestBootFromVolume-70355936-project-member] Lock "1fb9796e-e0d4-410d-bff1-a6a44b2a3580" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 671.660291] env[68906]: DEBUG oslo_concurrency.lockutils [None req-d75acd4f-807f-4ecd-9ae2-747ee1c4928b tempest-ServerAddressesNegativeTestJSON-1929406516 tempest-ServerAddressesNegativeTestJSON-1929406516-project-member] Acquiring lock "b77ff68e-350b-4f65-bb62-dfb727281e50" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 671.660291] env[68906]: DEBUG oslo_concurrency.lockutils [None req-d75acd4f-807f-4ecd-9ae2-747ee1c4928b tempest-ServerAddressesNegativeTestJSON-1929406516 tempest-ServerAddressesNegativeTestJSON-1929406516-project-member] Lock "b77ff68e-350b-4f65-bb62-dfb727281e50" acquired by
"nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 673.236221] env[68906]: DEBUG oslo_concurrency.lockutils [None req-86da2890-c2a9-4c1a-8bb4-8828668edefb tempest-InstanceActionsTestJSON-874396935 tempest-InstanceActionsTestJSON-874396935-project-member] Acquiring lock "29e5aa99-4e20-4b6f-a749-544b8c41a713" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 673.236573] env[68906]: DEBUG oslo_concurrency.lockutils [None req-86da2890-c2a9-4c1a-8bb4-8828668edefb tempest-InstanceActionsTestJSON-874396935 tempest-InstanceActionsTestJSON-874396935-project-member] Lock "29e5aa99-4e20-4b6f-a749-544b8c41a713" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 673.717377] env[68906]: DEBUG oslo_concurrency.lockutils [None req-a0d2b26c-2e2f-42a2-b855-68741d9ae4e1 tempest-ServersAaction247Test-1755435843 tempest-ServersAaction247Test-1755435843-project-member] Acquiring lock "20fa65c1-9ea0-4dc2-828e-8477c9f45baa" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 673.717486] env[68906]: DEBUG oslo_concurrency.lockutils [None req-a0d2b26c-2e2f-42a2-b855-68741d9ae4e1 tempest-ServersAaction247Test-1755435843 tempest-ServersAaction247Test-1755435843-project-member] Lock "20fa65c1-9ea0-4dc2-828e-8477c9f45baa" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 680.479333] env[68906]: WARNING oslo_vmware.rw_handles [None req-1fdaad8b-4336-4044-9b9d-0b74397210fc tempest-DeleteServersAdminTestJSON-1668331877 tempest-DeleteServersAdminTestJSON-1668331877-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 680.479333] env[68906]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 680.479333] env[68906]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 680.479333] env[68906]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 680.479333] env[68906]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 680.479333] env[68906]: ERROR oslo_vmware.rw_handles response.begin() [ 680.479333] env[68906]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 680.479333] env[68906]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 680.479333] env[68906]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 680.479333] env[68906]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 680.479333] env[68906]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without
response [ 680.479333] env[68906]: ERROR oslo_vmware.rw_handles [ 680.479933] env[68906]: DEBUG nova.virt.vmwareapi.images [None req-1fdaad8b-4336-4044-9b9d-0b74397210fc tempest-DeleteServersAdminTestJSON-1668331877 tempest-DeleteServersAdminTestJSON-1668331877-project-member] [instance: 57feb127-36f1-403c-bbca-7054286c1972] Downloaded image file data b1400c31-d33b-4e13-944f-4c645e62493e to vmware_temp/984d8c72-4a3f-4db8-8df6-23da021ebc2f/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk on the data store datastore2 {{(pid=68906) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 680.481334] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-1fdaad8b-4336-4044-9b9d-0b74397210fc tempest-DeleteServersAdminTestJSON-1668331877 tempest-DeleteServersAdminTestJSON-1668331877-project-member] [instance: 57feb127-36f1-403c-bbca-7054286c1972] Caching image {{(pid=68906) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 680.481611] env[68906]: DEBUG nova.virt.vmwareapi.vm_util [None req-1fdaad8b-4336-4044-9b9d-0b74397210fc tempest-DeleteServersAdminTestJSON-1668331877 tempest-DeleteServersAdminTestJSON-1668331877-project-member] Copying Virtual Disk [datastore2] vmware_temp/984d8c72-4a3f-4db8-8df6-23da021ebc2f/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk to [datastore2] vmware_temp/984d8c72-4a3f-4db8-8df6-23da021ebc2f/b1400c31-d33b-4e13-944f-4c645e62493e/b1400c31-d33b-4e13-944f-4c645e62493e.vmdk {{(pid=68906) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 680.481938] env[68906]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-997ed2c2-bd27-4dd5-945f-dda0a8f197cd {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 680.491154] env[68906]: DEBUG oslo_vmware.api [None req-1fdaad8b-4336-4044-9b9d-0b74397210fc tempest-DeleteServersAdminTestJSON-1668331877 tempest-DeleteServersAdminTestJSON-1668331877-project-member] Waiting for the task: (returnval){ [ 680.491154] env[68906]: value = "task-3475293" [ 680.491154] env[68906]: _type = "Task" [ 680.491154] env[68906]: } to complete. {{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 680.500965] env[68906]: DEBUG oslo_vmware.api [None req-1fdaad8b-4336-4044-9b9d-0b74397210fc tempest-DeleteServersAdminTestJSON-1668331877 tempest-DeleteServersAdminTestJSON-1668331877-project-member] Task: {'id': task-3475293, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 681.005948] env[68906]: DEBUG oslo_vmware.exceptions [None req-1fdaad8b-4336-4044-9b9d-0b74397210fc tempest-DeleteServersAdminTestJSON-1668331877 tempest-DeleteServersAdminTestJSON-1668331877-project-member] Fault InvalidArgument not matched. 
{{(pid=68906) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 681.006247] env[68906]: DEBUG oslo_concurrency.lockutils [None req-1fdaad8b-4336-4044-9b9d-0b74397210fc tempest-DeleteServersAdminTestJSON-1668331877 tempest-DeleteServersAdminTestJSON-1668331877-project-member] Releasing lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e/b1400c31-d33b-4e13-944f-4c645e62493e.vmdk" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 681.009411] env[68906]: ERROR nova.compute.manager [None req-1fdaad8b-4336-4044-9b9d-0b74397210fc tempest-DeleteServersAdminTestJSON-1668331877 tempest-DeleteServersAdminTestJSON-1668331877-project-member] [instance: 57feb127-36f1-403c-bbca-7054286c1972] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 681.009411] env[68906]: Faults: ['InvalidArgument'] [ 681.009411] env[68906]: ERROR nova.compute.manager [instance: 57feb127-36f1-403c-bbca-7054286c1972] Traceback (most recent call last): [ 681.009411] env[68906]: ERROR nova.compute.manager [instance: 57feb127-36f1-403c-bbca-7054286c1972] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 681.009411] env[68906]: ERROR nova.compute.manager [instance: 57feb127-36f1-403c-bbca-7054286c1972] yield resources [ 681.009411] env[68906]: ERROR nova.compute.manager [instance: 57feb127-36f1-403c-bbca-7054286c1972] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 681.009411] env[68906]: ERROR nova.compute.manager [instance: 57feb127-36f1-403c-bbca-7054286c1972] self.driver.spawn(context, instance, image_meta, [ 681.009411] env[68906]: ERROR nova.compute.manager [instance: 57feb127-36f1-403c-bbca-7054286c1972] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 681.009411] env[68906]: ERROR nova.compute.manager [instance: 57feb127-36f1-403c-bbca-7054286c1972] self._vmops.spawn(context, instance, image_meta, injected_files, [ 681.009411] env[68906]: ERROR nova.compute.manager [instance: 57feb127-36f1-403c-bbca-7054286c1972] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 681.009411] env[68906]: ERROR nova.compute.manager [instance: 57feb127-36f1-403c-bbca-7054286c1972] self._fetch_image_if_missing(context, vi) [ 681.009411] env[68906]: ERROR nova.compute.manager [instance: 57feb127-36f1-403c-bbca-7054286c1972] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 681.009955] env[68906]: ERROR nova.compute.manager [instance: 57feb127-36f1-403c-bbca-7054286c1972] image_cache(vi, tmp_image_ds_loc) [ 681.009955] env[68906]: ERROR nova.compute.manager [instance: 57feb127-36f1-403c-bbca-7054286c1972] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 681.009955] env[68906]: ERROR nova.compute.manager [instance: 57feb127-36f1-403c-bbca-7054286c1972] vm_util.copy_virtual_disk( [ 681.009955] env[68906]: ERROR nova.compute.manager [instance: 57feb127-36f1-403c-bbca-7054286c1972] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 681.009955] env[68906]: ERROR nova.compute.manager [instance: 57feb127-36f1-403c-bbca-7054286c1972] session._wait_for_task(vmdk_copy_task) [ 681.009955] env[68906]: ERROR nova.compute.manager [instance: 57feb127-36f1-403c-bbca-7054286c1972] File 
"/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 681.009955] env[68906]: ERROR nova.compute.manager [instance: 57feb127-36f1-403c-bbca-7054286c1972] return self.wait_for_task(task_ref) [ 681.009955] env[68906]: ERROR nova.compute.manager [instance: 57feb127-36f1-403c-bbca-7054286c1972] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 681.009955] env[68906]: ERROR nova.compute.manager [instance: 57feb127-36f1-403c-bbca-7054286c1972] return evt.wait() [ 681.009955] env[68906]: ERROR nova.compute.manager [instance: 57feb127-36f1-403c-bbca-7054286c1972] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 681.009955] env[68906]: ERROR nova.compute.manager [instance: 57feb127-36f1-403c-bbca-7054286c1972] result = hub.switch() [ 681.009955] env[68906]: ERROR nova.compute.manager [instance: 57feb127-36f1-403c-bbca-7054286c1972] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 681.009955] env[68906]: ERROR nova.compute.manager [instance: 57feb127-36f1-403c-bbca-7054286c1972] return self.greenlet.switch() [ 681.010367] env[68906]: ERROR nova.compute.manager [instance: 57feb127-36f1-403c-bbca-7054286c1972] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 681.010367] env[68906]: ERROR nova.compute.manager [instance: 57feb127-36f1-403c-bbca-7054286c1972] self.f(*self.args, **self.kw) [ 681.010367] env[68906]: ERROR nova.compute.manager [instance: 57feb127-36f1-403c-bbca-7054286c1972] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 681.010367] env[68906]: ERROR nova.compute.manager [instance: 57feb127-36f1-403c-bbca-7054286c1972] raise exceptions.translate_fault(task_info.error) [ 681.010367] env[68906]: ERROR nova.compute.manager [instance: 57feb127-36f1-403c-bbca-7054286c1972] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 681.010367] env[68906]: ERROR nova.compute.manager [instance: 57feb127-36f1-403c-bbca-7054286c1972] Faults: ['InvalidArgument'] [ 681.010367] env[68906]: ERROR nova.compute.manager [instance: 57feb127-36f1-403c-bbca-7054286c1972] [ 681.010367] env[68906]: INFO nova.compute.manager [None req-1fdaad8b-4336-4044-9b9d-0b74397210fc tempest-DeleteServersAdminTestJSON-1668331877 tempest-DeleteServersAdminTestJSON-1668331877-project-member] [instance: 57feb127-36f1-403c-bbca-7054286c1972] Terminating instance [ 681.011517] env[68906]: DEBUG oslo_concurrency.lockutils [None req-b7790bd5-6745-473c-89fa-de780d438374 tempest-ServerExternalEventsTest-1430980438 tempest-ServerExternalEventsTest-1430980438-project-member] Acquired lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e/b1400c31-d33b-4e13-944f-4c645e62493e.vmdk" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 681.012045] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-b7790bd5-6745-473c-89fa-de780d438374 tempest-ServerExternalEventsTest-1430980438 tempest-ServerExternalEventsTest-1430980438-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=68906) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 681.012459] env[68906]: DEBUG nova.compute.manager [None req-1fdaad8b-4336-4044-9b9d-0b74397210fc tempest-DeleteServersAdminTestJSON-1668331877 
tempest-DeleteServersAdminTestJSON-1668331877-project-member] [instance: 57feb127-36f1-403c-bbca-7054286c1972] Start destroying the instance on the hypervisor. {{(pid=68906) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 681.012660] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-1fdaad8b-4336-4044-9b9d-0b74397210fc tempest-DeleteServersAdminTestJSON-1668331877 tempest-DeleteServersAdminTestJSON-1668331877-project-member] [instance: 57feb127-36f1-403c-bbca-7054286c1972] Destroying instance {{(pid=68906) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 681.012897] env[68906]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-d2bb0c20-d972-4d0f-ae79-e2e9cb5014a8 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 681.016122] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1dd9e359-a604-4e4c-b0b5-3275161dcb31 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 681.027783] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-1fdaad8b-4336-4044-9b9d-0b74397210fc tempest-DeleteServersAdminTestJSON-1668331877 tempest-DeleteServersAdminTestJSON-1668331877-project-member] [instance: 57feb127-36f1-403c-bbca-7054286c1972] Unregistering the VM {{(pid=68906) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 681.029488] env[68906]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-6cdd5120-b1f4-4e58-a79a-5dfb891f5e4f {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 681.031872] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-b7790bd5-6745-473c-89fa-de780d438374 tempest-ServerExternalEventsTest-1430980438 tempest-ServerExternalEventsTest-1430980438-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=68906) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 681.032174] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-b7790bd5-6745-473c-89fa-de780d438374 tempest-ServerExternalEventsTest-1430980438 tempest-ServerExternalEventsTest-1430980438-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=68906) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 681.033041] env[68906]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-edf9d6f7-c9f7-48d8-862d-a4d6cf59dc9e {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 681.041821] env[68906]: DEBUG oslo_vmware.api [None req-b7790bd5-6745-473c-89fa-de780d438374 tempest-ServerExternalEventsTest-1430980438 tempest-ServerExternalEventsTest-1430980438-project-member] Waiting for the task: (returnval){ [ 681.041821] env[68906]: value = "session[52a3cb0d-1212-490c-2669-91043b4da4d8]52b789d5-38b1-351d-6725-ef7dceb90ecc" [ 681.041821] env[68906]: _type = "Task" [ 681.041821] env[68906]: } to complete. 
{{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 681.056905] env[68906]: DEBUG oslo_vmware.api [None req-b7790bd5-6745-473c-89fa-de780d438374 tempest-ServerExternalEventsTest-1430980438 tempest-ServerExternalEventsTest-1430980438-project-member] Task: {'id': session[52a3cb0d-1212-490c-2669-91043b4da4d8]52b789d5-38b1-351d-6725-ef7dceb90ecc, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 681.125021] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-1fdaad8b-4336-4044-9b9d-0b74397210fc tempest-DeleteServersAdminTestJSON-1668331877 tempest-DeleteServersAdminTestJSON-1668331877-project-member] [instance: 57feb127-36f1-403c-bbca-7054286c1972] Unregistered the VM {{(pid=68906) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 681.125021] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-1fdaad8b-4336-4044-9b9d-0b74397210fc tempest-DeleteServersAdminTestJSON-1668331877 tempest-DeleteServersAdminTestJSON-1668331877-project-member] [instance: 57feb127-36f1-403c-bbca-7054286c1972] Deleting contents of the VM from datastore datastore2 {{(pid=68906) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 681.125021] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-1fdaad8b-4336-4044-9b9d-0b74397210fc tempest-DeleteServersAdminTestJSON-1668331877 tempest-DeleteServersAdminTestJSON-1668331877-project-member] Deleting the datastore file [datastore2] 57feb127-36f1-403c-bbca-7054286c1972 {{(pid=68906) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 681.125021] env[68906]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-6a66f265-1881-45b3-8110-54e03b9b79d4 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 681.132208] env[68906]: DEBUG oslo_vmware.api [None req-1fdaad8b-4336-4044-9b9d-0b74397210fc tempest-DeleteServersAdminTestJSON-1668331877 tempest-DeleteServersAdminTestJSON-1668331877-project-member] Waiting for the task: (returnval){ [ 681.132208] env[68906]: value = "task-3475295" [ 681.132208] env[68906]: _type = "Task" [ 681.132208] env[68906]: } to complete. {{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 681.142228] env[68906]: DEBUG oslo_vmware.api [None req-1fdaad8b-4336-4044-9b9d-0b74397210fc tempest-DeleteServersAdminTestJSON-1668331877 tempest-DeleteServersAdminTestJSON-1668331877-project-member] Task: {'id': task-3475295, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 681.315832] env[68906]: DEBUG oslo_concurrency.lockutils [None req-0325c848-7458-4b40-a533-6b073c138188 tempest-FloatingIPsAssociationNegativeTestJSON-883500755 tempest-FloatingIPsAssociationNegativeTestJSON-883500755-project-member] Acquiring lock "98844da1-0e2a-46b5-8e72-c0f8dcd29b27" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 681.316080] env[68906]: DEBUG oslo_concurrency.lockutils [None req-0325c848-7458-4b40-a533-6b073c138188 tempest-FloatingIPsAssociationNegativeTestJSON-883500755 tempest-FloatingIPsAssociationNegativeTestJSON-883500755-project-member] Lock "98844da1-0e2a-46b5-8e72-c0f8dcd29b27" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 681.558709] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-b7790bd5-6745-473c-89fa-de780d438374 tempest-ServerExternalEventsTest-1430980438 tempest-ServerExternalEventsTest-1430980438-project-member] [instance: e2ee8d01-b1d3-4bde-81ae-668ffeef42b0] Preparing fetch location {{(pid=68906) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 681.558981] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-b7790bd5-6745-473c-89fa-de780d438374 tempest-ServerExternalEventsTest-1430980438 tempest-ServerExternalEventsTest-1430980438-project-member] Creating directory with path [datastore2] vmware_temp/88c0448b-c20e-4947-b6c7-8e2c0ea94cd4/b1400c31-d33b-4e13-944f-4c645e62493e {{(pid=68906) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 681.559508] env[68906]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-da51e5ea-cfee-4011-9491-57fa96f0df2d {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 681.572412] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-b7790bd5-6745-473c-89fa-de780d438374 tempest-ServerExternalEventsTest-1430980438 tempest-ServerExternalEventsTest-1430980438-project-member] Created directory with path [datastore2] vmware_temp/88c0448b-c20e-4947-b6c7-8e2c0ea94cd4/b1400c31-d33b-4e13-944f-4c645e62493e {{(pid=68906) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 681.572672] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-b7790bd5-6745-473c-89fa-de780d438374 tempest-ServerExternalEventsTest-1430980438 tempest-ServerExternalEventsTest-1430980438-project-member] [instance: e2ee8d01-b1d3-4bde-81ae-668ffeef42b0] Fetch image to [datastore2] vmware_temp/88c0448b-c20e-4947-b6c7-8e2c0ea94cd4/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk {{(pid=68906) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 681.572849] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-b7790bd5-6745-473c-89fa-de780d438374 tempest-ServerExternalEventsTest-1430980438 tempest-ServerExternalEventsTest-1430980438-project-member] [instance: e2ee8d01-b1d3-4bde-81ae-668ffeef42b0] Downloading image file data b1400c31-d33b-4e13-944f-4c645e62493e to [datastore2] vmware_temp/88c0448b-c20e-4947-b6c7-8e2c0ea94cd4/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk on the data store datastore2 {{(pid=68906) _fetch_image_as_file
/opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 681.573973] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-abdf430a-677c-46b2-b43b-acf8ef5c6442 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 681.582238] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-32bb2de9-c0b1-438f-9639-9b6420407a5e {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 681.592761] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-32e29619-29d3-4028-9e08-211158d991df {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 681.626020] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d9ea6f6d-dbac-4de2-96d4-71b2020ef4aa {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 681.635906] env[68906]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-06a24444-8588-406a-88de-f22feb993371 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 681.642363] env[68906]: DEBUG oslo_vmware.api [None req-1fdaad8b-4336-4044-9b9d-0b74397210fc tempest-DeleteServersAdminTestJSON-1668331877 tempest-DeleteServersAdminTestJSON-1668331877-project-member] Task: {'id': task-3475295, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.090008} completed successfully. {{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 681.642773] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-1fdaad8b-4336-4044-9b9d-0b74397210fc tempest-DeleteServersAdminTestJSON-1668331877 tempest-DeleteServersAdminTestJSON-1668331877-project-member] Deleted the datastore file {{(pid=68906) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 681.642850] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-1fdaad8b-4336-4044-9b9d-0b74397210fc tempest-DeleteServersAdminTestJSON-1668331877 tempest-DeleteServersAdminTestJSON-1668331877-project-member] [instance: 57feb127-36f1-403c-bbca-7054286c1972] Deleted contents of the VM from datastore datastore2 {{(pid=68906) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 681.642974] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-1fdaad8b-4336-4044-9b9d-0b74397210fc tempest-DeleteServersAdminTestJSON-1668331877 tempest-DeleteServersAdminTestJSON-1668331877-project-member] [instance: 57feb127-36f1-403c-bbca-7054286c1972] Instance destroyed {{(pid=68906) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 681.643436] env[68906]: INFO nova.compute.manager [None req-1fdaad8b-4336-4044-9b9d-0b74397210fc tempest-DeleteServersAdminTestJSON-1668331877 tempest-DeleteServersAdminTestJSON-1668331877-project-member] [instance: 57feb127-36f1-403c-bbca-7054286c1972] Took 0.63 seconds to destroy the instance on the hypervisor. 
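[editor's note] The entries above show the oslo.vmware task-wait pattern: each vCenter operation returns a Task moref, and wait_for_task polls it (the "_poll_task ... progress is 0%" lines) until it completes or fails. A minimal sketch of that pattern, assuming placeholder vCenter credentials (host, user, and file path below are not values from this log):

```python
# Minimal sketch of the oslo.vmware task-wait pattern seen above:
# invoke an API that returns a Task moref, then block until it finishes.
from oslo_vmware import api

session = api.VMwareAPISession(
    'vc.example.test', 'user', 'secret',
    api_retry_count=10, task_poll_interval=0.5)

# Mirrors the FileManager.DeleteDatastoreFile_Task call in the log.
file_manager = session.vim.service_content.fileManager
task = session.invoke_api(
    session.vim, 'DeleteDatastoreFile_Task', file_manager,
    name='[datastore2] some/path')
session.wait_for_task(task)  # polls like _poll_task; raises on task error
```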
[ 681.645928] env[68906]: DEBUG nova.compute.claims [None req-1fdaad8b-4336-4044-9b9d-0b74397210fc tempest-DeleteServersAdminTestJSON-1668331877 tempest-DeleteServersAdminTestJSON-1668331877-project-member] [instance: 57feb127-36f1-403c-bbca-7054286c1972] Aborting claim: {{(pid=68906) abort /opt/stack/nova/nova/compute/claims.py:85}}
[ 681.646136] env[68906]: DEBUG oslo_concurrency.lockutils [None req-1fdaad8b-4336-4044-9b9d-0b74397210fc tempest-DeleteServersAdminTestJSON-1668331877 tempest-DeleteServersAdminTestJSON-1668331877-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 681.646354] env[68906]: DEBUG oslo_concurrency.lockutils [None req-1fdaad8b-4336-4044-9b9d-0b74397210fc tempest-DeleteServersAdminTestJSON-1668331877 tempest-DeleteServersAdminTestJSON-1668331877-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 681.657732] env[68906]: DEBUG nova.virt.vmwareapi.images [None req-b7790bd5-6745-473c-89fa-de780d438374 tempest-ServerExternalEventsTest-1430980438 tempest-ServerExternalEventsTest-1430980438-project-member] [instance: e2ee8d01-b1d3-4bde-81ae-668ffeef42b0] Downloading image file data b1400c31-d33b-4e13-944f-4c645e62493e to the data store datastore2 {{(pid=68906) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}}
[ 681.721026] env[68906]: DEBUG oslo_vmware.rw_handles [None req-b7790bd5-6745-473c-89fa-de780d438374 tempest-ServerExternalEventsTest-1430980438 tempest-ServerExternalEventsTest-1430980438-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/88c0448b-c20e-4947-b6c7-8e2c0ea94cd4/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=68906) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}}
[ 681.787023] env[68906]: DEBUG oslo_vmware.rw_handles [None req-b7790bd5-6745-473c-89fa-de780d438374 tempest-ServerExternalEventsTest-1430980438 tempest-ServerExternalEventsTest-1430980438-project-member] Completed reading data from the image iterator. {{(pid=68906) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}}
[ 681.787023] env[68906]: DEBUG oslo_vmware.rw_handles [None req-b7790bd5-6745-473c-89fa-de780d438374 tempest-ServerExternalEventsTest-1430980438 tempest-ServerExternalEventsTest-1430980438-project-member] Closing write handle for https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/88c0448b-c20e-4947-b6c7-8e2c0ea94cd4/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=68906) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}}
[ 682.223641] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8b5b6c96-8f88-4602-ba55-f6262ff2ec6d {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 682.236893] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1cf192f4-c858-4343-a54f-762247419e57 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 682.273706] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a8345bb3-927c-4979-b295-7a1040d0f77c {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 682.281083] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-50484195-05f9-470f-9ecf-c0ae552799f5 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 682.296300] env[68906]: DEBUG nova.compute.provider_tree [None req-1fdaad8b-4336-4044-9b9d-0b74397210fc tempest-DeleteServersAdminTestJSON-1668331877 tempest-DeleteServersAdminTestJSON-1668331877-project-member] Inventory has not changed in ProviderTree for provider: 1119f6db-bfd7-4ef3-bdff-5c6974dc249b {{(pid=68906) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 682.311105] env[68906]: DEBUG nova.scheduler.client.report [None req-1fdaad8b-4336-4044-9b9d-0b74397210fc tempest-DeleteServersAdminTestJSON-1668331877 tempest-DeleteServersAdminTestJSON-1668331877-project-member] Inventory has not changed for provider 1119f6db-bfd7-4ef3-bdff-5c6974dc249b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 93, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68906) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 682.331124] env[68906]: DEBUG oslo_concurrency.lockutils [None req-1fdaad8b-4336-4044-9b9d-0b74397210fc tempest-DeleteServersAdminTestJSON-1668331877 tempest-DeleteServersAdminTestJSON-1668331877-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.684s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 682.331708] env[68906]: ERROR nova.compute.manager [None req-1fdaad8b-4336-4044-9b9d-0b74397210fc tempest-DeleteServersAdminTestJSON-1668331877 tempest-DeleteServersAdminTestJSON-1668331877-project-member] [instance: 57feb127-36f1-403c-bbca-7054286c1972] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 682.331708] env[68906]: Faults: ['InvalidArgument']
[ 682.331708] env[68906]: ERROR nova.compute.manager [instance: 57feb127-36f1-403c-bbca-7054286c1972] Traceback (most recent call last):
[ 682.331708] env[68906]: ERROR nova.compute.manager [instance: 57feb127-36f1-403c-bbca-7054286c1972]   File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance
[ 682.331708] env[68906]: ERROR nova.compute.manager [instance: 57feb127-36f1-403c-bbca-7054286c1972]     self.driver.spawn(context, instance, image_meta,
[ 682.331708] env[68906]: ERROR nova.compute.manager [instance: 57feb127-36f1-403c-bbca-7054286c1972]   File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn
[ 682.331708] env[68906]: ERROR nova.compute.manager [instance: 57feb127-36f1-403c-bbca-7054286c1972]     self._vmops.spawn(context, instance, image_meta, injected_files,
[ 682.331708] env[68906]: ERROR nova.compute.manager [instance: 57feb127-36f1-403c-bbca-7054286c1972]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 682.331708] env[68906]: ERROR nova.compute.manager [instance: 57feb127-36f1-403c-bbca-7054286c1972]     self._fetch_image_if_missing(context, vi)
[ 682.331708] env[68906]: ERROR nova.compute.manager [instance: 57feb127-36f1-403c-bbca-7054286c1972]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 682.331708] env[68906]: ERROR nova.compute.manager [instance: 57feb127-36f1-403c-bbca-7054286c1972]     image_cache(vi, tmp_image_ds_loc)
[ 682.331708] env[68906]: ERROR nova.compute.manager [instance: 57feb127-36f1-403c-bbca-7054286c1972]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 682.332087] env[68906]: ERROR nova.compute.manager [instance: 57feb127-36f1-403c-bbca-7054286c1972]     vm_util.copy_virtual_disk(
[ 682.332087] env[68906]: ERROR nova.compute.manager [instance: 57feb127-36f1-403c-bbca-7054286c1972]   File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 682.332087] env[68906]: ERROR nova.compute.manager [instance: 57feb127-36f1-403c-bbca-7054286c1972]     session._wait_for_task(vmdk_copy_task)
[ 682.332087] env[68906]: ERROR nova.compute.manager [instance: 57feb127-36f1-403c-bbca-7054286c1972]   File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 682.332087] env[68906]: ERROR nova.compute.manager [instance: 57feb127-36f1-403c-bbca-7054286c1972]     return self.wait_for_task(task_ref)
[ 682.332087] env[68906]: ERROR nova.compute.manager [instance: 57feb127-36f1-403c-bbca-7054286c1972]   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 682.332087] env[68906]: ERROR nova.compute.manager [instance: 57feb127-36f1-403c-bbca-7054286c1972]     return evt.wait()
[ 682.332087] env[68906]: ERROR nova.compute.manager [instance: 57feb127-36f1-403c-bbca-7054286c1972]   File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait
[ 682.332087] env[68906]: ERROR nova.compute.manager [instance: 57feb127-36f1-403c-bbca-7054286c1972]     result = hub.switch()
[ 682.332087] env[68906]: ERROR nova.compute.manager [instance: 57feb127-36f1-403c-bbca-7054286c1972]   File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch
[ 682.332087] env[68906]: ERROR nova.compute.manager [instance: 57feb127-36f1-403c-bbca-7054286c1972]     return self.greenlet.switch()
[ 682.332087] env[68906]: ERROR nova.compute.manager [instance: 57feb127-36f1-403c-bbca-7054286c1972]   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 682.332087] env[68906]: ERROR nova.compute.manager [instance: 57feb127-36f1-403c-bbca-7054286c1972]     self.f(*self.args, **self.kw)
[ 682.333831] env[68906]: ERROR nova.compute.manager [instance: 57feb127-36f1-403c-bbca-7054286c1972]   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 682.333831] env[68906]: ERROR nova.compute.manager [instance: 57feb127-36f1-403c-bbca-7054286c1972]     raise exceptions.translate_fault(task_info.error)
[ 682.333831] env[68906]: ERROR nova.compute.manager [instance: 57feb127-36f1-403c-bbca-7054286c1972] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 682.333831] env[68906]: ERROR nova.compute.manager [instance: 57feb127-36f1-403c-bbca-7054286c1972] Faults: ['InvalidArgument']
[ 682.333831] env[68906]: ERROR nova.compute.manager [instance: 57feb127-36f1-403c-bbca-7054286c1972]
[ 682.333831] env[68906]: DEBUG nova.compute.utils [None req-1fdaad8b-4336-4044-9b9d-0b74397210fc tempest-DeleteServersAdminTestJSON-1668331877 tempest-DeleteServersAdminTestJSON-1668331877-project-member] [instance: 57feb127-36f1-403c-bbca-7054286c1972] VimFaultException {{(pid=68906) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 682.338215] env[68906]: DEBUG nova.compute.manager [None req-1fdaad8b-4336-4044-9b9d-0b74397210fc tempest-DeleteServersAdminTestJSON-1668331877 tempest-DeleteServersAdminTestJSON-1668331877-project-member] [instance: 57feb127-36f1-403c-bbca-7054286c1972] Build of instance 57feb127-36f1-403c-bbca-7054286c1972 was re-scheduled: A specified parameter was not correct: fileType
[ 682.338215] env[68906]: Faults: ['InvalidArgument'] {{(pid=68906) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}}
[ 682.338662] env[68906]: DEBUG nova.compute.manager [None req-1fdaad8b-4336-4044-9b9d-0b74397210fc tempest-DeleteServersAdminTestJSON-1668331877 tempest-DeleteServersAdminTestJSON-1668331877-project-member] [instance: 57feb127-36f1-403c-bbca-7054286c1972] Unplugging VIFs for instance {{(pid=68906) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}}
[ 682.338859] env[68906]: DEBUG nova.compute.manager [None req-1fdaad8b-4336-4044-9b9d-0b74397210fc tempest-DeleteServersAdminTestJSON-1668331877 tempest-DeleteServersAdminTestJSON-1668331877-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=68906) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}}
[ 682.339027] env[68906]: DEBUG nova.compute.manager [None req-1fdaad8b-4336-4044-9b9d-0b74397210fc tempest-DeleteServersAdminTestJSON-1668331877 tempest-DeleteServersAdminTestJSON-1668331877-project-member] [instance: 57feb127-36f1-403c-bbca-7054286c1972] Deallocating network for instance {{(pid=68906) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}}
[ 682.339210] env[68906]: DEBUG nova.network.neutron [None req-1fdaad8b-4336-4044-9b9d-0b74397210fc tempest-DeleteServersAdminTestJSON-1668331877 tempest-DeleteServersAdminTestJSON-1668331877-project-member] [instance: 57feb127-36f1-403c-bbca-7054286c1972] deallocate_for_instance() {{(pid=68906) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 682.993836] env[68906]: DEBUG nova.network.neutron [None req-1fdaad8b-4336-4044-9b9d-0b74397210fc tempest-DeleteServersAdminTestJSON-1668331877 tempest-DeleteServersAdminTestJSON-1668331877-project-member] [instance: 57feb127-36f1-403c-bbca-7054286c1972] Updating instance_info_cache with network_info: [] {{(pid=68906) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 683.022444] env[68906]: INFO nova.compute.manager [None req-1fdaad8b-4336-4044-9b9d-0b74397210fc tempest-DeleteServersAdminTestJSON-1668331877 tempest-DeleteServersAdminTestJSON-1668331877-project-member] [instance: 57feb127-36f1-403c-bbca-7054286c1972] Took 0.68 seconds to deallocate network for instance.
[ 683.189228] env[68906]: INFO nova.scheduler.client.report [None req-1fdaad8b-4336-4044-9b9d-0b74397210fc tempest-DeleteServersAdminTestJSON-1668331877 tempest-DeleteServersAdminTestJSON-1668331877-project-member] Deleted allocations for instance 57feb127-36f1-403c-bbca-7054286c1972
[ 683.214120] env[68906]: DEBUG oslo_concurrency.lockutils [None req-1fdaad8b-4336-4044-9b9d-0b74397210fc tempest-DeleteServersAdminTestJSON-1668331877 tempest-DeleteServersAdminTestJSON-1668331877-project-member] Lock "57feb127-36f1-403c-bbca-7054286c1972" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 64.622s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 683.248028] env[68906]: DEBUG nova.compute.manager [None req-237bd299-a887-4870-a49f-dd296ebfa92b tempest-VolumesAssistedSnapshotsTest-429909811 tempest-VolumesAssistedSnapshotsTest-429909811-project-member] [instance: 9a2d2803-34b1-40f7-9349-e5734a217e18] Starting instance... {{(pid=68906) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}}
[ 683.309954] env[68906]: DEBUG oslo_concurrency.lockutils [None req-237bd299-a887-4870-a49f-dd296ebfa92b tempest-VolumesAssistedSnapshotsTest-429909811 tempest-VolumesAssistedSnapshotsTest-429909811-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 683.310727] env[68906]: DEBUG oslo_concurrency.lockutils [None req-237bd299-a887-4870-a49f-dd296ebfa92b tempest-VolumesAssistedSnapshotsTest-429909811 tempest-VolumesAssistedSnapshotsTest-429909811-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 683.313188] env[68906]: INFO nova.compute.claims [None req-237bd299-a887-4870-a49f-dd296ebfa92b tempest-VolumesAssistedSnapshotsTest-429909811 tempest-VolumesAssistedSnapshotsTest-429909811-project-member] [instance: 9a2d2803-34b1-40f7-9349-e5734a217e18] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28
[ 683.839165] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-636eba45-297e-4a22-a8df-1ffc78bafd32 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 683.848240] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9d34f57c-fb3e-4466-a680-0fd1065d564a {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 683.880750] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d618953e-a04d-40a9-a424-018d9a2b4c91 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 683.888492] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-917da6f2-e071-4cb6-a96c-33aeb8116fee {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 683.903160] env[68906]: DEBUG nova.compute.provider_tree [None req-237bd299-a887-4870-a49f-dd296ebfa92b tempest-VolumesAssistedSnapshotsTest-429909811 tempest-VolumesAssistedSnapshotsTest-429909811-project-member] Inventory has not changed in ProviderTree for provider: 1119f6db-bfd7-4ef3-bdff-5c6974dc249b {{(pid=68906) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 683.913188] env[68906]: DEBUG nova.scheduler.client.report [None req-237bd299-a887-4870-a49f-dd296ebfa92b tempest-VolumesAssistedSnapshotsTest-429909811 tempest-VolumesAssistedSnapshotsTest-429909811-project-member] Inventory has not changed for provider 1119f6db-bfd7-4ef3-bdff-5c6974dc249b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 93, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68906) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 683.935454] env[68906]: DEBUG oslo_concurrency.lockutils [None req-237bd299-a887-4870-a49f-dd296ebfa92b tempest-VolumesAssistedSnapshotsTest-429909811 tempest-VolumesAssistedSnapshotsTest-429909811-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.623s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 683.935454] env[68906]: DEBUG nova.compute.manager [None req-237bd299-a887-4870-a49f-dd296ebfa92b tempest-VolumesAssistedSnapshotsTest-429909811 tempest-VolumesAssistedSnapshotsTest-429909811-project-member] [instance: 9a2d2803-34b1-40f7-9349-e5734a217e18] Start building networks asynchronously for instance. {{(pid=68906) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}}
[ 683.983445] env[68906]: DEBUG nova.compute.utils [None req-237bd299-a887-4870-a49f-dd296ebfa92b tempest-VolumesAssistedSnapshotsTest-429909811 tempest-VolumesAssistedSnapshotsTest-429909811-project-member] Using /dev/sd instead of None {{(pid=68906) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}}
[ 683.984624] env[68906]: DEBUG nova.compute.manager [None req-237bd299-a887-4870-a49f-dd296ebfa92b tempest-VolumesAssistedSnapshotsTest-429909811 tempest-VolumesAssistedSnapshotsTest-429909811-project-member] [instance: 9a2d2803-34b1-40f7-9349-e5734a217e18] Allocating IP information in the background. {{(pid=68906) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}}
[ 683.984790] env[68906]: DEBUG nova.network.neutron [None req-237bd299-a887-4870-a49f-dd296ebfa92b tempest-VolumesAssistedSnapshotsTest-429909811 tempest-VolumesAssistedSnapshotsTest-429909811-project-member] [instance: 9a2d2803-34b1-40f7-9349-e5734a217e18] allocate_for_instance() {{(pid=68906) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}}
[ 684.004026] env[68906]: DEBUG nova.compute.manager [None req-237bd299-a887-4870-a49f-dd296ebfa92b tempest-VolumesAssistedSnapshotsTest-429909811 tempest-VolumesAssistedSnapshotsTest-429909811-project-member] [instance: 9a2d2803-34b1-40f7-9349-e5734a217e18] Start building block device mappings for instance. {{(pid=68906) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}}
[ 684.092574] env[68906]: DEBUG nova.compute.manager [None req-237bd299-a887-4870-a49f-dd296ebfa92b tempest-VolumesAssistedSnapshotsTest-429909811 tempest-VolumesAssistedSnapshotsTest-429909811-project-member] [instance: 9a2d2803-34b1-40f7-9349-e5734a217e18] Start spawning the instance on the hypervisor. {{(pid=68906) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}}
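[editor's note] The "Acquiring lock ... / Lock ... acquired ... waited / released ... held" triples throughout this log come from oslo.concurrency's lockutils wrapper, which times how long a caller waited for and then held a named lock. A minimal sketch of both forms seen here (function name is illustrative):

```python
# Sketch of the oslo.concurrency pattern behind the
# "Acquiring lock ... / acquired ... waited / released ... held" lines.
from oslo_concurrency import lockutils

@lockutils.synchronized('compute_resources')
def instance_claim():
    # runs with the named lock held; the wrapper logs acquisition,
    # wait time, and hold time at DEBUG, as seen above
    pass

# context-manager form, as used for the "refresh_cache-<uuid>" locks
with lockutils.lock('refresh_cache-9a2d2803-34b1-40f7-9349-e5734a217e18'):
    pass
```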
[ 684.121647] env[68906]: DEBUG nova.virt.hardware [None req-237bd299-a887-4870-a49f-dd296ebfa92b tempest-VolumesAssistedSnapshotsTest-429909811 tempest-VolumesAssistedSnapshotsTest-429909811-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-17T13:00:39Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-17T13:00:23Z,direct_url=,disk_format='vmdk',id=b1400c31-d33b-4e13-944f-4c645e62493e,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='1ae7bf3a375d41c6af5e7536af51ffd1',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-17T13:00:24Z,virtual_size=,visibility=), allow threads: False {{(pid=68906) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}}
[ 684.121908] env[68906]: DEBUG nova.virt.hardware [None req-237bd299-a887-4870-a49f-dd296ebfa92b tempest-VolumesAssistedSnapshotsTest-429909811 tempest-VolumesAssistedSnapshotsTest-429909811-project-member] Flavor limits 0:0:0 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}}
[ 684.122530] env[68906]: DEBUG nova.virt.hardware [None req-237bd299-a887-4870-a49f-dd296ebfa92b tempest-VolumesAssistedSnapshotsTest-429909811 tempest-VolumesAssistedSnapshotsTest-429909811-project-member] Image limits 0:0:0 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}}
[ 684.122530] env[68906]: DEBUG nova.virt.hardware [None req-237bd299-a887-4870-a49f-dd296ebfa92b tempest-VolumesAssistedSnapshotsTest-429909811 tempest-VolumesAssistedSnapshotsTest-429909811-project-member] Flavor pref 0:0:0 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}}
[ 684.122530] env[68906]: DEBUG nova.virt.hardware [None req-237bd299-a887-4870-a49f-dd296ebfa92b tempest-VolumesAssistedSnapshotsTest-429909811 tempest-VolumesAssistedSnapshotsTest-429909811-project-member] Image pref 0:0:0 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}}
[ 684.122530] env[68906]: DEBUG nova.virt.hardware [None req-237bd299-a887-4870-a49f-dd296ebfa92b tempest-VolumesAssistedSnapshotsTest-429909811 tempest-VolumesAssistedSnapshotsTest-429909811-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}}
[ 684.122763] env[68906]: DEBUG nova.virt.hardware [None req-237bd299-a887-4870-a49f-dd296ebfa92b tempest-VolumesAssistedSnapshotsTest-429909811 tempest-VolumesAssistedSnapshotsTest-429909811-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68906) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}}
[ 684.122882] env[68906]: DEBUG nova.virt.hardware [None req-237bd299-a887-4870-a49f-dd296ebfa92b tempest-VolumesAssistedSnapshotsTest-429909811 tempest-VolumesAssistedSnapshotsTest-429909811-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68906) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}}
[ 684.123167] env[68906]: DEBUG nova.virt.hardware [None req-237bd299-a887-4870-a49f-dd296ebfa92b tempest-VolumesAssistedSnapshotsTest-429909811 tempest-VolumesAssistedSnapshotsTest-429909811-project-member] Got 1 possible topologies {{(pid=68906) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}}
[ 684.123400] env[68906]: DEBUG nova.virt.hardware [None req-237bd299-a887-4870-a49f-dd296ebfa92b tempest-VolumesAssistedSnapshotsTest-429909811 tempest-VolumesAssistedSnapshotsTest-429909811-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68906) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}}
[ 684.123628] env[68906]: DEBUG nova.virt.hardware [None req-237bd299-a887-4870-a49f-dd296ebfa92b tempest-VolumesAssistedSnapshotsTest-429909811 tempest-VolumesAssistedSnapshotsTest-429909811-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68906) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}}
[ 684.124712] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fcb71c95-63bc-4318-88e7-f0dafa7dd82a {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 684.133707] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-43aaa6e7-545c-4a12-b1d9-18daa0a9c810 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 684.331645] env[68906]: DEBUG nova.policy [None req-237bd299-a887-4870-a49f-dd296ebfa92b tempest-VolumesAssistedSnapshotsTest-429909811 tempest-VolumesAssistedSnapshotsTest-429909811-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '03f15c38339249ae8d9c3ee929972f71', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'ee86725d5efc478a9f54f29693407e36', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68906) authorize /opt/stack/nova/nova/policy.py:203}}
[ 684.980051] env[68906]: DEBUG nova.network.neutron [None req-237bd299-a887-4870-a49f-dd296ebfa92b tempest-VolumesAssistedSnapshotsTest-429909811 tempest-VolumesAssistedSnapshotsTest-429909811-project-member] [instance: 9a2d2803-34b1-40f7-9349-e5734a217e18] Successfully created port: 64934c11-0b20-4c85-b246-e6bccf212283 {{(pid=68906) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}}
[ 686.136609] env[68906]: DEBUG oslo_concurrency.lockutils [None req-236f039b-07c6-4a4d-a7d0-3f9cc96aa05c tempest-AttachInterfacesV270Test-1110429519 tempest-AttachInterfacesV270Test-1110429519-project-member] Acquiring lock "91d405f9-5ad1-4e8e-9a32-1a5ef81ecc63" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 686.136914] env[68906]: DEBUG oslo_concurrency.lockutils [None req-236f039b-07c6-4a4d-a7d0-3f9cc96aa05c tempest-AttachInterfacesV270Test-1110429519 tempest-AttachInterfacesV270Test-1110429519-project-member] Lock "91d405f9-5ad1-4e8e-9a32-1a5ef81ecc63" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
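[editor's note] With no flavor or image topology limits set (all the 0:0:0 lines above), nova.virt.hardware enumerates topologies whose sockets, cores, and threads multiply out to the vCPU count, which for 1 vCPU leaves only 1:1:1. A simplified, hypothetical re-implementation of that enumeration (not Nova's actual code, just the idea):

```python
# Simplified illustration of "Build topologies for N vcpu(s)":
# enumerate (sockets, cores, threads) triples whose product equals the
# vCPU count, within the given maxima. Nova's real logic lives in
# nova/virt/hardware.py; this is only a sketch.
def possible_topologies(vcpus, max_sockets=65536, max_cores=65536,
                        max_threads=65536):
    for sockets in range(1, min(vcpus, max_sockets) + 1):
        for cores in range(1, min(vcpus, max_cores) + 1):
            for threads in range(1, min(vcpus, max_threads) + 1):
                if sockets * cores * threads == vcpus:
                    yield (sockets, cores, threads)

print(list(possible_topologies(1)))  # [(1, 1, 1)], matching the log
```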
"nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 686.417360] env[68906]: DEBUG nova.network.neutron [None req-237bd299-a887-4870-a49f-dd296ebfa92b tempest-VolumesAssistedSnapshotsTest-429909811 tempest-VolumesAssistedSnapshotsTest-429909811-project-member] [instance: 9a2d2803-34b1-40f7-9349-e5734a217e18] Successfully updated port: 64934c11-0b20-4c85-b246-e6bccf212283 {{(pid=68906) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 686.426356] env[68906]: DEBUG oslo_concurrency.lockutils [None req-237bd299-a887-4870-a49f-dd296ebfa92b tempest-VolumesAssistedSnapshotsTest-429909811 tempest-VolumesAssistedSnapshotsTest-429909811-project-member] Acquiring lock "refresh_cache-9a2d2803-34b1-40f7-9349-e5734a217e18" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 686.426515] env[68906]: DEBUG oslo_concurrency.lockutils [None req-237bd299-a887-4870-a49f-dd296ebfa92b tempest-VolumesAssistedSnapshotsTest-429909811 tempest-VolumesAssistedSnapshotsTest-429909811-project-member] Acquired lock "refresh_cache-9a2d2803-34b1-40f7-9349-e5734a217e18" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 686.426696] env[68906]: DEBUG nova.network.neutron [None req-237bd299-a887-4870-a49f-dd296ebfa92b tempest-VolumesAssistedSnapshotsTest-429909811 tempest-VolumesAssistedSnapshotsTest-429909811-project-member] [instance: 9a2d2803-34b1-40f7-9349-e5734a217e18] Building network info cache for instance {{(pid=68906) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 686.514175] env[68906]: DEBUG nova.network.neutron [None req-237bd299-a887-4870-a49f-dd296ebfa92b tempest-VolumesAssistedSnapshotsTest-429909811 tempest-VolumesAssistedSnapshotsTest-429909811-project-member] [instance: 9a2d2803-34b1-40f7-9349-e5734a217e18] Instance cache missing network info. 
{{(pid=68906) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 686.843136] env[68906]: DEBUG nova.network.neutron [None req-237bd299-a887-4870-a49f-dd296ebfa92b tempest-VolumesAssistedSnapshotsTest-429909811 tempest-VolumesAssistedSnapshotsTest-429909811-project-member] [instance: 9a2d2803-34b1-40f7-9349-e5734a217e18] Updating instance_info_cache with network_info: [{"id": "64934c11-0b20-4c85-b246-e6bccf212283", "address": "fa:16:3e:39:40:40", "network": {"id": "a7bc3138-4295-48d3-a21d-13d17b8a3bcb", "bridge": "br-int", "label": "tempest-VolumesAssistedSnapshotsTest-335074880-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "ee86725d5efc478a9f54f29693407e36", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "0df968ae-c1ef-4009-a0f4-6f2e799c2fda", "external-id": "nsx-vlan-transportzone-864", "segmentation_id": 864, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap64934c11-0b", "ovs_interfaceid": "64934c11-0b20-4c85-b246-e6bccf212283", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68906) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 686.855256] env[68906]: DEBUG oslo_concurrency.lockutils [None req-237bd299-a887-4870-a49f-dd296ebfa92b tempest-VolumesAssistedSnapshotsTest-429909811 tempest-VolumesAssistedSnapshotsTest-429909811-project-member] Releasing lock "refresh_cache-9a2d2803-34b1-40f7-9349-e5734a217e18" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 686.855568] env[68906]: DEBUG nova.compute.manager [None req-237bd299-a887-4870-a49f-dd296ebfa92b tempest-VolumesAssistedSnapshotsTest-429909811 tempest-VolumesAssistedSnapshotsTest-429909811-project-member] [instance: 9a2d2803-34b1-40f7-9349-e5734a217e18] Instance network_info: |[{"id": "64934c11-0b20-4c85-b246-e6bccf212283", "address": "fa:16:3e:39:40:40", "network": {"id": "a7bc3138-4295-48d3-a21d-13d17b8a3bcb", "bridge": "br-int", "label": "tempest-VolumesAssistedSnapshotsTest-335074880-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "ee86725d5efc478a9f54f29693407e36", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "0df968ae-c1ef-4009-a0f4-6f2e799c2fda", "external-id": "nsx-vlan-transportzone-864", "segmentation_id": 864, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap64934c11-0b", "ovs_interfaceid": "64934c11-0b20-4c85-b246-e6bccf212283", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=68906) 
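[editor's note] The network_info blob cached above is Nova's serialized network model: a list of VIFs, each with a port id, MAC address, and a nested network carrying subnets/IPs plus backend binding details (here an NSX logical switch bound by the nsxv3 driver). A small sketch showing how such an entry decomposes; the dict literal is abbreviated from the log entry above:

```python
# Sketch: pulling the fields the VMware driver needs out of one cached
# network_info entry (structure copied from the log, values abbreviated).
vif = {
    "id": "64934c11-0b20-4c85-b246-e6bccf212283",
    "address": "fa:16:3e:39:40:40",
    "network": {
        "bridge": "br-int",
        "subnets": [{"cidr": "192.168.128.0/28",
                     "ips": [{"address": "192.168.128.10"}]}],
    },
    "details": {"nsx-logical-switch-id":
                "0df968ae-c1ef-4009-a0f4-6f2e799c2fda"},
    "vnic_type": "normal",
}

fixed_ip = vif["network"]["subnets"][0]["ips"][0]["address"]
switch_id = vif["details"]["nsx-logical-switch-id"]
print(vif["address"], fixed_ip, switch_id)
```

This is exactly what the "Instance VIF info" entry below reduces it to: a MAC, a vif_model, and an OpaqueNetwork reference built from the nsx-logical-switch-id.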
[ 686.855977] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-237bd299-a887-4870-a49f-dd296ebfa92b tempest-VolumesAssistedSnapshotsTest-429909811 tempest-VolumesAssistedSnapshotsTest-429909811-project-member] [instance: 9a2d2803-34b1-40f7-9349-e5734a217e18] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:39:40:40', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '0df968ae-c1ef-4009-a0f4-6f2e799c2fda', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '64934c11-0b20-4c85-b246-e6bccf212283', 'vif_model': 'vmxnet3'}] {{(pid=68906) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}}
[ 686.864787] env[68906]: DEBUG nova.virt.vmwareapi.vm_util [None req-237bd299-a887-4870-a49f-dd296ebfa92b tempest-VolumesAssistedSnapshotsTest-429909811 tempest-VolumesAssistedSnapshotsTest-429909811-project-member] Creating folder: Project (ee86725d5efc478a9f54f29693407e36). Parent ref: group-v694750. {{(pid=68906) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}}
[ 686.865421] env[68906]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-6e826894-e19f-4a56-8144-5b377aab36fd {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 686.878434] env[68906]: INFO nova.virt.vmwareapi.vm_util [None req-237bd299-a887-4870-a49f-dd296ebfa92b tempest-VolumesAssistedSnapshotsTest-429909811 tempest-VolumesAssistedSnapshotsTest-429909811-project-member] Created folder: Project (ee86725d5efc478a9f54f29693407e36) in parent group-v694750.
[ 686.878662] env[68906]: DEBUG nova.virt.vmwareapi.vm_util [None req-237bd299-a887-4870-a49f-dd296ebfa92b tempest-VolumesAssistedSnapshotsTest-429909811 tempest-VolumesAssistedSnapshotsTest-429909811-project-member] Creating folder: Instances. Parent ref: group-v694785. {{(pid=68906) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}}
[ 686.878912] env[68906]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-d4f5610c-da92-43ec-9cdc-f0981c58c48d {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 686.891601] env[68906]: INFO nova.virt.vmwareapi.vm_util [None req-237bd299-a887-4870-a49f-dd296ebfa92b tempest-VolumesAssistedSnapshotsTest-429909811 tempest-VolumesAssistedSnapshotsTest-429909811-project-member] Created folder: Instances in parent group-v694785.
[ 686.891939] env[68906]: DEBUG oslo.service.loopingcall [None req-237bd299-a887-4870-a49f-dd296ebfa92b tempest-VolumesAssistedSnapshotsTest-429909811 tempest-VolumesAssistedSnapshotsTest-429909811-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=68906) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}}
[ 686.892082] env[68906]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 9a2d2803-34b1-40f7-9349-e5734a217e18] Creating VM on the ESX host {{(pid=68906) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}}
[ 686.892288] env[68906]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-101b24f6-ad6e-4c3f-9186-8505043184cf {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 686.915447] env[68906]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){
[ 686.915447] env[68906]: value = "task-3475298"
[ 686.915447] env[68906]: _type = "Task"
[ 686.915447] env[68906]: } to complete. {{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 686.923691] env[68906]: DEBUG oslo_vmware.api [-] Task: {'id': task-3475298, 'name': CreateVM_Task} progress is 0%. {{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 687.347324] env[68906]: DEBUG nova.compute.manager [req-6b334a5d-30bd-4601-928f-b3e4d4cf3cf2 req-3a6147ee-0624-4883-8951-3da92ff268e6 service nova] [instance: 9a2d2803-34b1-40f7-9349-e5734a217e18] Received event network-vif-plugged-64934c11-0b20-4c85-b246-e6bccf212283 {{(pid=68906) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}}
[ 687.347324] env[68906]: DEBUG oslo_concurrency.lockutils [req-6b334a5d-30bd-4601-928f-b3e4d4cf3cf2 req-3a6147ee-0624-4883-8951-3da92ff268e6 service nova] Acquiring lock "9a2d2803-34b1-40f7-9349-e5734a217e18-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 687.348304] env[68906]: DEBUG oslo_concurrency.lockutils [req-6b334a5d-30bd-4601-928f-b3e4d4cf3cf2 req-3a6147ee-0624-4883-8951-3da92ff268e6 service nova] Lock "9a2d2803-34b1-40f7-9349-e5734a217e18-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 687.348716] env[68906]: DEBUG oslo_concurrency.lockutils [req-6b334a5d-30bd-4601-928f-b3e4d4cf3cf2 req-3a6147ee-0624-4883-8951-3da92ff268e6 service nova] Lock "9a2d2803-34b1-40f7-9349-e5734a217e18-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 687.349043] env[68906]: DEBUG nova.compute.manager [req-6b334a5d-30bd-4601-928f-b3e4d4cf3cf2 req-3a6147ee-0624-4883-8951-3da92ff268e6 service nova] [instance: 9a2d2803-34b1-40f7-9349-e5734a217e18] No waiting events found dispatching network-vif-plugged-64934c11-0b20-4c85-b246-e6bccf212283 {{(pid=68906) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}}
[ 687.349349] env[68906]: WARNING nova.compute.manager [req-6b334a5d-30bd-4601-928f-b3e4d4cf3cf2 req-3a6147ee-0624-4883-8951-3da92ff268e6 service nova] [instance: 9a2d2803-34b1-40f7-9349-e5734a217e18] Received unexpected event network-vif-plugged-64934c11-0b20-4c85-b246-e6bccf212283 for instance with vm_state building and task_state spawning.
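[editor's note] The "Received event network-vif-plugged ... / No waiting events found ... / Received unexpected event" sequence shows Neutron's callback arriving before the driver registered a waiter: compute pops per-instance events and signals whichever greenthread is blocked on them, and an event nobody is waiting for is logged as unexpected. A toy sketch of that waiter/dispatcher handshake using eventlet primitives (simplified; not Nova's InstanceEvents code):

```python
# Toy sketch of the external-event handshake: a spawning thread waits on
# "network-vif-plugged-<port>", and the event callback signals it. If the
# callback fires before anyone registered a waiter, it is "unexpected".
import eventlet
eventlet.monkey_patch()

waiters = {}  # event name -> eventlet.event.Event

def prepare_for_event(name):
    waiters[name] = eventlet.event.Event()
    return waiters[name]

def dispatch(name):
    evt = waiters.pop(name, None)
    if evt is None:
        print('Received unexpected event %s' % name)  # as in the log
    else:
        evt.send(name)

w = prepare_for_event('network-vif-plugged-64934c11')
dispatch('network-vif-plugged-64934c11')
print(w.wait())
```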
[ 687.428124] env[68906]: DEBUG oslo_vmware.api [-] Task: {'id': task-3475298, 'name': CreateVM_Task, 'duration_secs': 0.351914} completed successfully. {{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}}
[ 687.428441] env[68906]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 9a2d2803-34b1-40f7-9349-e5734a217e18] Created VM on the ESX host {{(pid=68906) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}}
[ 687.429379] env[68906]: DEBUG oslo_concurrency.lockutils [None req-237bd299-a887-4870-a49f-dd296ebfa92b tempest-VolumesAssistedSnapshotsTest-429909811 tempest-VolumesAssistedSnapshotsTest-429909811-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 687.429720] env[68906]: DEBUG oslo_concurrency.lockutils [None req-237bd299-a887-4870-a49f-dd296ebfa92b tempest-VolumesAssistedSnapshotsTest-429909811 tempest-VolumesAssistedSnapshotsTest-429909811-project-member] Acquired lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 687.430646] env[68906]: DEBUG oslo_concurrency.lockutils [None req-237bd299-a887-4870-a49f-dd296ebfa92b tempest-VolumesAssistedSnapshotsTest-429909811 tempest-VolumesAssistedSnapshotsTest-429909811-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}}
[ 687.433029] env[68906]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-95aa8820-9749-4fdb-bb9f-3802eeeddab7 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 687.437206] env[68906]: DEBUG oslo_vmware.api [None req-237bd299-a887-4870-a49f-dd296ebfa92b tempest-VolumesAssistedSnapshotsTest-429909811 tempest-VolumesAssistedSnapshotsTest-429909811-project-member] Waiting for the task: (returnval){
[ 687.437206] env[68906]: value = "session[52a3cb0d-1212-490c-2669-91043b4da4d8]5240fad4-fb30-77ca-87e9-b004ecbe1411"
[ 687.437206] env[68906]: _type = "Task"
[ 687.437206] env[68906]: } to complete. {{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 687.446831] env[68906]: DEBUG oslo_vmware.api [None req-237bd299-a887-4870-a49f-dd296ebfa92b tempest-VolumesAssistedSnapshotsTest-429909811 tempest-VolumesAssistedSnapshotsTest-429909811-project-member] Task: {'id': session[52a3cb0d-1212-490c-2669-91043b4da4d8]5240fad4-fb30-77ca-87e9-b004ecbe1411, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 687.951244] env[68906]: DEBUG oslo_concurrency.lockutils [None req-237bd299-a887-4870-a49f-dd296ebfa92b tempest-VolumesAssistedSnapshotsTest-429909811 tempest-VolumesAssistedSnapshotsTest-429909811-project-member] Releasing lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 687.951519] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-237bd299-a887-4870-a49f-dd296ebfa92b tempest-VolumesAssistedSnapshotsTest-429909811 tempest-VolumesAssistedSnapshotsTest-429909811-project-member] [instance: 9a2d2803-34b1-40f7-9349-e5734a217e18] Processing image b1400c31-d33b-4e13-944f-4c645e62493e {{(pid=68906) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}}
[ 687.951729] env[68906]: DEBUG oslo_concurrency.lockutils [None req-237bd299-a887-4870-a49f-dd296ebfa92b tempest-VolumesAssistedSnapshotsTest-429909811 tempest-VolumesAssistedSnapshotsTest-429909811-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e/b1400c31-d33b-4e13-944f-4c645e62493e.vmdk" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 688.819493] env[68906]: DEBUG oslo_concurrency.lockutils [None req-dc466130-d812-4b39-b2cd-70750d4485d7 tempest-DeleteServersAdminTestJSON-1668331877 tempest-DeleteServersAdminTestJSON-1668331877-project-member] Acquiring lock "653c016d-c596-4f45-a18e-55f2d1935166" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 688.819855] env[68906]: DEBUG oslo_concurrency.lockutils [None req-dc466130-d812-4b39-b2cd-70750d4485d7 tempest-DeleteServersAdminTestJSON-1668331877 tempest-DeleteServersAdminTestJSON-1668331877-project-member] Lock "653c016d-c596-4f45-a18e-55f2d1935166" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 690.395215] env[68906]: DEBUG nova.compute.manager [req-c2d99adb-511d-4d0d-8c4c-adea444bfa2b req-6639d042-8006-4a6b-9448-c7639fc10061 service nova] [instance: 9a2d2803-34b1-40f7-9349-e5734a217e18] Received event network-changed-64934c11-0b20-4c85-b246-e6bccf212283 {{(pid=68906) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}}
[ 690.395473] env[68906]: DEBUG nova.compute.manager [req-c2d99adb-511d-4d0d-8c4c-adea444bfa2b req-6639d042-8006-4a6b-9448-c7639fc10061 service nova] [instance: 9a2d2803-34b1-40f7-9349-e5734a217e18] Refreshing instance network info cache due to event network-changed-64934c11-0b20-4c85-b246-e6bccf212283. {{(pid=68906) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}}
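[editor's note] Note the lock names in this span: the driver serializes on the datastore image-cache path ("[datastore2] devstack-image-cache_base/<image-id>") so that only one build fetches or inspects a given cached image at a time, then takes a finer-grained lock on the .vmdk itself. A hedged sketch of that per-resource locking idiom (the body is a placeholder, not the driver's actual check):

```python
# Sketch of the per-image serialization visible above: lock on the
# image-cache directory while checking/fetching, so concurrent builds
# of the same image don't race. Names mirror the log; logic simplified.
from oslo_concurrency import lockutils

cache_root = '[datastore2] devstack-image-cache_base'
image_id = 'b1400c31-d33b-4e13-944f-4c645e62493e'

with lockutils.lock('%s/%s' % (cache_root, image_id)):
    # check whether the cached VMDK exists; fetch from Glance if not
    pass
```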
[ 690.395740] env[68906]: DEBUG oslo_concurrency.lockutils [req-c2d99adb-511d-4d0d-8c4c-adea444bfa2b req-6639d042-8006-4a6b-9448-c7639fc10061 service nova] Acquiring lock "refresh_cache-9a2d2803-34b1-40f7-9349-e5734a217e18" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 690.395919] env[68906]: DEBUG oslo_concurrency.lockutils [req-c2d99adb-511d-4d0d-8c4c-adea444bfa2b req-6639d042-8006-4a6b-9448-c7639fc10061 service nova] Acquired lock "refresh_cache-9a2d2803-34b1-40f7-9349-e5734a217e18" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 690.396126] env[68906]: DEBUG nova.network.neutron [req-c2d99adb-511d-4d0d-8c4c-adea444bfa2b req-6639d042-8006-4a6b-9448-c7639fc10061 service nova] [instance: 9a2d2803-34b1-40f7-9349-e5734a217e18] Refreshing network info cache for port 64934c11-0b20-4c85-b246-e6bccf212283 {{(pid=68906) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}}
[ 690.664546] env[68906]: DEBUG oslo_concurrency.lockutils [None req-2c716e36-1965-4ec1-9f11-d2bddcd0b495 tempest-ServersTestJSON-364002111 tempest-ServersTestJSON-364002111-project-member] Acquiring lock "627c0227-72ca-4a77-aca1-bc3112955e7a" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 690.664760] env[68906]: DEBUG oslo_concurrency.lockutils [None req-2c716e36-1965-4ec1-9f11-d2bddcd0b495 tempest-ServersTestJSON-364002111 tempest-ServersTestJSON-364002111-project-member] Lock "627c0227-72ca-4a77-aca1-bc3112955e7a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 690.882102] env[68906]: DEBUG nova.network.neutron [req-c2d99adb-511d-4d0d-8c4c-adea444bfa2b req-6639d042-8006-4a6b-9448-c7639fc10061 service nova] [instance: 9a2d2803-34b1-40f7-9349-e5734a217e18] Updated VIF entry in instance network info cache for port 64934c11-0b20-4c85-b246-e6bccf212283. {{(pid=68906) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}}
[ 690.882409] env[68906]: DEBUG nova.network.neutron [req-c2d99adb-511d-4d0d-8c4c-adea444bfa2b req-6639d042-8006-4a6b-9448-c7639fc10061 service nova] [instance: 9a2d2803-34b1-40f7-9349-e5734a217e18] Updating instance_info_cache with network_info: [{"id": "64934c11-0b20-4c85-b246-e6bccf212283", "address": "fa:16:3e:39:40:40", "network": {"id": "a7bc3138-4295-48d3-a21d-13d17b8a3bcb", "bridge": "br-int", "label": "tempest-VolumesAssistedSnapshotsTest-335074880-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "ee86725d5efc478a9f54f29693407e36", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "0df968ae-c1ef-4009-a0f4-6f2e799c2fda", "external-id": "nsx-vlan-transportzone-864", "segmentation_id": 864, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap64934c11-0b", "ovs_interfaceid": "64934c11-0b20-4c85-b246-e6bccf212283", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68906) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 690.891328] env[68906]: DEBUG oslo_concurrency.lockutils [req-c2d99adb-511d-4d0d-8c4c-adea444bfa2b req-6639d042-8006-4a6b-9448-c7639fc10061 service nova] Releasing lock "refresh_cache-9a2d2803-34b1-40f7-9349-e5734a217e18" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 692.546543] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 692.586598] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 693.140619] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 693.557261] env[68906]: DEBUG oslo_concurrency.lockutils [None req-639c9c99-7440-4229-8f59-1abf591f4d11 tempest-ServersAdmin275Test-949005569 tempest-ServersAdmin275Test-949005569-project-member] Acquiring lock "03e8dff3-b6b8-4754-8725-dddc9f9e6216" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 693.557578] env[68906]: DEBUG oslo_concurrency.lockutils [None req-639c9c99-7440-4229-8f59-1abf591f4d11 tempest-ServersAdmin275Test-949005569 tempest-ServersAdmin275Test-949005569-project-member] Lock "03e8dff3-b6b8-4754-8725-dddc9f9e6216" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
"nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 694.136205] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 694.139607] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 694.139807] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Starting heal instance info cache {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 694.139936] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Rebuilding the list of instances to heal {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 694.168694] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: e2ee8d01-b1d3-4bde-81ae-668ffeef42b0] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 694.168694] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: 46481a4e-ac53-456d-b6cb-9f3ffbccf407] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 694.168694] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: d2258ded-478a-4530-b940-386286702048] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 694.168694] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: da0c4340-a657-43bd-9a98-4c8f50add720] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 694.168694] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: 0540a4dc-1b86-4776-b633-f540af168a2b] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 694.169127] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: 4edb8b9f-b608-4be8-bfd3-65642710f9bd] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 694.169127] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: d6ca51b9-b284-405c-878e-fdbc326b73e1] Skipping network cache update for instance because it is Building. 
{{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 694.169127] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: f42056e5-52cb-4d69-8022-ca643c49194e] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 694.169127] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: ce63789a-1f0f-40ca-8368-ac3f84bb58cd] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 694.169127] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: 9a2d2803-34b1-40f7-9349-e5734a217e18] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 694.169302] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Didn't find any instances for network info cache update. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 694.169302] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 694.169302] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 694.169302] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 694.169302] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 694.169302] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=68906) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 694.169497] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager.update_available_resource {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 694.186961] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 694.187209] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 694.187415] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 694.187536] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=68906) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 694.188625] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5176c1b7-cf25-46f3-82e1-8f04b822d30c {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 694.198220] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4c15b626-ddb6-4d98-8c26-d72ae28a3a1c {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 694.214242] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e3c20045-2295-416b-b7d7-b6375b0d8063 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 694.219856] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e82cfec3-ca5e-4bd6-9b49-69430f753ee0 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 694.257085] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180934MB free_disk=93GB free_vcpus=48 pci_devices=None {{(pid=68906) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 694.257249] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 694.257454] 
env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 694.365446] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance e2ee8d01-b1d3-4bde-81ae-668ffeef42b0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 694.365571] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 46481a4e-ac53-456d-b6cb-9f3ffbccf407 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 694.365713] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance d2258ded-478a-4530-b940-386286702048 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 694.365823] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance da0c4340-a657-43bd-9a98-4c8f50add720 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 694.365947] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 0540a4dc-1b86-4776-b633-f540af168a2b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 694.366077] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 4edb8b9f-b608-4be8-bfd3-65642710f9bd actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 694.366196] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance d6ca51b9-b284-405c-878e-fdbc326b73e1 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 694.366310] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance f42056e5-52cb-4d69-8022-ca643c49194e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 694.366422] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance ce63789a-1f0f-40ca-8368-ac3f84bb58cd actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 694.398178] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 13eebe4e-5984-46c3-bb73-cd783ad45df6 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 694.398351] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 9a2d2803-34b1-40f7-9349-e5734a217e18 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 694.429838] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance a7e0a28f-42a5-442e-b962-07771d2e6a27 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 694.443690] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance eb81e9b1-b573-4d7c-9ede-f8b32a43a201 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 694.456032] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 12724be5-cfb1-4cf6-b98b-b4142da21714 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 694.467037] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance b3bd0ecb-f329-48f3-b48b-25751262a5eb has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 694.481365] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance e63fba5c-46fd-494c-9aec-dd76f12974d7 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 694.492618] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance c02f41e2-8a99-4f18-9d86-82fa702bb2b6 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 694.503096] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance a2414623-7871-4706-81db-7d15ca74fdab has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 694.515498] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 252028a3-3d3e-44c5-9c51-26752962a90d has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 694.528234] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 5ebd4d05-ddb3-4001-a526-a0c96b081818 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 694.539683] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance c2e2265b-aef3-4a8c-ae03-314e679af64b has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 694.550906] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 3ce9d4bd-3d7a-4191-9d3e-892efc81e8a2 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}. 
{{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 694.563285] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 3f26342e-89a8-4218-8875-8411eb8b16a0 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 694.576189] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 1fb9796e-e0d4-410d-bff1-a6a44b2a3580 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 694.589193] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance b77ff68e-350b-4f65-bb62-dfb727281e50 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 694.599883] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 29e5aa99-4e20-4b6f-a749-544b8c41a713 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 694.610743] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 20fa65c1-9ea0-4dc2-828e-8477c9f45baa has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 694.621629] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 98844da1-0e2a-46b5-8e72-c0f8dcd29b27 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 694.632245] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 91d405f9-5ad1-4e8e-9a32-1a5ef81ecc63 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 694.644050] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 653c016d-c596-4f45-a18e-55f2d1935166 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 694.654763] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 627c0227-72ca-4a77-aca1-bc3112955e7a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 694.666491] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 03e8dff3-b6b8-4754-8725-dddc9f9e6216 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 694.666808] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=68906) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 694.666913] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=68906) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 695.185026] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-53db22fa-40f2-47bf-8860-bdf577afcbfe {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 695.194017] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1eebcca6-9d79-4a77-8927-662622111c42 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 695.228261] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-82c21137-a2dd-4d24-b26b-5dce2fc1f7f8 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 695.235931] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d774a1c1-774c-4dc0-bade-49edf5a96c5f {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 695.250762] env[68906]: DEBUG nova.compute.provider_tree [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Inventory has not changed in ProviderTree for provider: 
1119f6db-bfd7-4ef3-bdff-5c6974dc249b {{(pid=68906) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 695.261202] env[68906]: DEBUG nova.scheduler.client.report [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Inventory has not changed for provider 1119f6db-bfd7-4ef3-bdff-5c6974dc249b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 93, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68906) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 695.289247] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=68906) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 695.289247] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.031s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 697.487293] env[68906]: DEBUG oslo_concurrency.lockutils [None req-dd87d48f-02f7-4c99-a534-1093df4a8f74 tempest-InstanceActionsNegativeTestJSON-1585889666 tempest-InstanceActionsNegativeTestJSON-1585889666-project-member] Acquiring lock "d3c5fdf4-a775-4b88-9bc2-ce9f31a9e6ac" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 697.487666] env[68906]: DEBUG oslo_concurrency.lockutils [None req-dd87d48f-02f7-4c99-a534-1093df4a8f74 tempest-InstanceActionsNegativeTestJSON-1585889666 tempest-InstanceActionsNegativeTestJSON-1585889666-project-member] Lock "d3c5fdf4-a775-4b88-9bc2-ce9f31a9e6ac" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 701.234107] env[68906]: DEBUG oslo_concurrency.lockutils [None req-41575b90-eab8-4bb8-a519-30f8f4618f78 tempest-ImagesOneServerNegativeTestJSON-982035189 tempest-ImagesOneServerNegativeTestJSON-982035189-project-member] Acquiring lock "242433e2-5b59-4b19-ba8d-80432ee4b7b7" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 701.234460] env[68906]: DEBUG oslo_concurrency.lockutils [None req-41575b90-eab8-4bb8-a519-30f8f4618f78 tempest-ImagesOneServerNegativeTestJSON-982035189 tempest-ImagesOneServerNegativeTestJSON-982035189-project-member] Lock "242433e2-5b59-4b19-ba8d-80432ee4b7b7" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 730.097721] env[68906]: WARNING oslo_vmware.rw_handles [None req-b7790bd5-6745-473c-89fa-de780d438374 tempest-ServerExternalEventsTest-1430980438 
tempest-ServerExternalEventsTest-1430980438-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 730.097721] env[68906]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 730.097721] env[68906]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 730.097721] env[68906]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 730.097721] env[68906]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 730.097721] env[68906]: ERROR oslo_vmware.rw_handles response.begin() [ 730.097721] env[68906]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 730.097721] env[68906]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 730.097721] env[68906]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 730.097721] env[68906]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 730.097721] env[68906]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 730.097721] env[68906]: ERROR oslo_vmware.rw_handles [ 730.098242] env[68906]: DEBUG nova.virt.vmwareapi.images [None req-b7790bd5-6745-473c-89fa-de780d438374 tempest-ServerExternalEventsTest-1430980438 tempest-ServerExternalEventsTest-1430980438-project-member] [instance: e2ee8d01-b1d3-4bde-81ae-668ffeef42b0] Downloaded image file data b1400c31-d33b-4e13-944f-4c645e62493e to vmware_temp/88c0448b-c20e-4947-b6c7-8e2c0ea94cd4/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk on the data store datastore2 {{(pid=68906) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 730.100269] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-b7790bd5-6745-473c-89fa-de780d438374 tempest-ServerExternalEventsTest-1430980438 tempest-ServerExternalEventsTest-1430980438-project-member] [instance: e2ee8d01-b1d3-4bde-81ae-668ffeef42b0] Caching image {{(pid=68906) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 730.100628] env[68906]: DEBUG nova.virt.vmwareapi.vm_util [None req-b7790bd5-6745-473c-89fa-de780d438374 tempest-ServerExternalEventsTest-1430980438 tempest-ServerExternalEventsTest-1430980438-project-member] Copying Virtual Disk [datastore2] vmware_temp/88c0448b-c20e-4947-b6c7-8e2c0ea94cd4/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk to [datastore2] vmware_temp/88c0448b-c20e-4947-b6c7-8e2c0ea94cd4/b1400c31-d33b-4e13-944f-4c645e62493e/b1400c31-d33b-4e13-944f-4c645e62493e.vmdk {{(pid=68906) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 730.100972] env[68906]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-1cd893f4-f444-4905-829b-e7842e2af611 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 730.109021] env[68906]: DEBUG oslo_vmware.api [None req-b7790bd5-6745-473c-89fa-de780d438374 tempest-ServerExternalEventsTest-1430980438 tempest-ServerExternalEventsTest-1430980438-project-member] Waiting for the task: (returnval){ [ 730.109021] env[68906]: value = "task-3475299" [ 730.109021] env[68906]: _type = "Task" [ 730.109021] env[68906]: } to complete. 
{{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 730.116695] env[68906]: DEBUG oslo_vmware.api [None req-b7790bd5-6745-473c-89fa-de780d438374 tempest-ServerExternalEventsTest-1430980438 tempest-ServerExternalEventsTest-1430980438-project-member] Task: {'id': task-3475299, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 730.620024] env[68906]: DEBUG oslo_vmware.exceptions [None req-b7790bd5-6745-473c-89fa-de780d438374 tempest-ServerExternalEventsTest-1430980438 tempest-ServerExternalEventsTest-1430980438-project-member] Fault InvalidArgument not matched. {{(pid=68906) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 730.620323] env[68906]: DEBUG oslo_concurrency.lockutils [None req-b7790bd5-6745-473c-89fa-de780d438374 tempest-ServerExternalEventsTest-1430980438 tempest-ServerExternalEventsTest-1430980438-project-member] Releasing lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e/b1400c31-d33b-4e13-944f-4c645e62493e.vmdk" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 730.620904] env[68906]: ERROR nova.compute.manager [None req-b7790bd5-6745-473c-89fa-de780d438374 tempest-ServerExternalEventsTest-1430980438 tempest-ServerExternalEventsTest-1430980438-project-member] [instance: e2ee8d01-b1d3-4bde-81ae-668ffeef42b0] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 730.620904] env[68906]: Faults: ['InvalidArgument'] [ 730.620904] env[68906]: ERROR nova.compute.manager [instance: e2ee8d01-b1d3-4bde-81ae-668ffeef42b0] Traceback (most recent call last): [ 730.620904] env[68906]: ERROR nova.compute.manager [instance: e2ee8d01-b1d3-4bde-81ae-668ffeef42b0] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 730.620904] env[68906]: ERROR nova.compute.manager [instance: e2ee8d01-b1d3-4bde-81ae-668ffeef42b0] yield resources [ 730.620904] env[68906]: ERROR nova.compute.manager [instance: e2ee8d01-b1d3-4bde-81ae-668ffeef42b0] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 730.620904] env[68906]: ERROR nova.compute.manager [instance: e2ee8d01-b1d3-4bde-81ae-668ffeef42b0] self.driver.spawn(context, instance, image_meta, [ 730.620904] env[68906]: ERROR nova.compute.manager [instance: e2ee8d01-b1d3-4bde-81ae-668ffeef42b0] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 730.620904] env[68906]: ERROR nova.compute.manager [instance: e2ee8d01-b1d3-4bde-81ae-668ffeef42b0] self._vmops.spawn(context, instance, image_meta, injected_files, [ 730.620904] env[68906]: ERROR nova.compute.manager [instance: e2ee8d01-b1d3-4bde-81ae-668ffeef42b0] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 730.620904] env[68906]: ERROR nova.compute.manager [instance: e2ee8d01-b1d3-4bde-81ae-668ffeef42b0] self._fetch_image_if_missing(context, vi) [ 730.620904] env[68906]: ERROR nova.compute.manager [instance: e2ee8d01-b1d3-4bde-81ae-668ffeef42b0] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 730.621294] env[68906]: ERROR nova.compute.manager [instance: e2ee8d01-b1d3-4bde-81ae-668ffeef42b0] image_cache(vi, tmp_image_ds_loc) [ 730.621294] env[68906]: ERROR nova.compute.manager [instance: 
e2ee8d01-b1d3-4bde-81ae-668ffeef42b0] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 730.621294] env[68906]: ERROR nova.compute.manager [instance: e2ee8d01-b1d3-4bde-81ae-668ffeef42b0] vm_util.copy_virtual_disk( [ 730.621294] env[68906]: ERROR nova.compute.manager [instance: e2ee8d01-b1d3-4bde-81ae-668ffeef42b0] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 730.621294] env[68906]: ERROR nova.compute.manager [instance: e2ee8d01-b1d3-4bde-81ae-668ffeef42b0] session._wait_for_task(vmdk_copy_task) [ 730.621294] env[68906]: ERROR nova.compute.manager [instance: e2ee8d01-b1d3-4bde-81ae-668ffeef42b0] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 730.621294] env[68906]: ERROR nova.compute.manager [instance: e2ee8d01-b1d3-4bde-81ae-668ffeef42b0] return self.wait_for_task(task_ref) [ 730.621294] env[68906]: ERROR nova.compute.manager [instance: e2ee8d01-b1d3-4bde-81ae-668ffeef42b0] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 730.621294] env[68906]: ERROR nova.compute.manager [instance: e2ee8d01-b1d3-4bde-81ae-668ffeef42b0] return evt.wait() [ 730.621294] env[68906]: ERROR nova.compute.manager [instance: e2ee8d01-b1d3-4bde-81ae-668ffeef42b0] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 730.621294] env[68906]: ERROR nova.compute.manager [instance: e2ee8d01-b1d3-4bde-81ae-668ffeef42b0] result = hub.switch() [ 730.621294] env[68906]: ERROR nova.compute.manager [instance: e2ee8d01-b1d3-4bde-81ae-668ffeef42b0] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 730.621294] env[68906]: ERROR nova.compute.manager [instance: e2ee8d01-b1d3-4bde-81ae-668ffeef42b0] return self.greenlet.switch() [ 730.621661] env[68906]: ERROR nova.compute.manager [instance: e2ee8d01-b1d3-4bde-81ae-668ffeef42b0] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 730.621661] env[68906]: ERROR nova.compute.manager [instance: e2ee8d01-b1d3-4bde-81ae-668ffeef42b0] self.f(*self.args, **self.kw) [ 730.621661] env[68906]: ERROR nova.compute.manager [instance: e2ee8d01-b1d3-4bde-81ae-668ffeef42b0] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 730.621661] env[68906]: ERROR nova.compute.manager [instance: e2ee8d01-b1d3-4bde-81ae-668ffeef42b0] raise exceptions.translate_fault(task_info.error) [ 730.621661] env[68906]: ERROR nova.compute.manager [instance: e2ee8d01-b1d3-4bde-81ae-668ffeef42b0] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 730.621661] env[68906]: ERROR nova.compute.manager [instance: e2ee8d01-b1d3-4bde-81ae-668ffeef42b0] Faults: ['InvalidArgument'] [ 730.621661] env[68906]: ERROR nova.compute.manager [instance: e2ee8d01-b1d3-4bde-81ae-668ffeef42b0] [ 730.621661] env[68906]: INFO nova.compute.manager [None req-b7790bd5-6745-473c-89fa-de780d438374 tempest-ServerExternalEventsTest-1430980438 tempest-ServerExternalEventsTest-1430980438-project-member] [instance: e2ee8d01-b1d3-4bde-81ae-668ffeef42b0] Terminating instance [ 730.622827] env[68906]: DEBUG oslo_concurrency.lockutils [None req-6e630c8f-2ec9-418e-9d9b-a4ca682ae7cd tempest-FloatingIPsAssociationTestJSON-1370684228 tempest-FloatingIPsAssociationTestJSON-1370684228-project-member] Acquired lock "[datastore2] 
devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e/b1400c31-d33b-4e13-944f-4c645e62493e.vmdk" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 730.623041] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-6e630c8f-2ec9-418e-9d9b-a4ca682ae7cd tempest-FloatingIPsAssociationTestJSON-1370684228 tempest-FloatingIPsAssociationTestJSON-1370684228-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=68906) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 730.623667] env[68906]: DEBUG nova.compute.manager [None req-b7790bd5-6745-473c-89fa-de780d438374 tempest-ServerExternalEventsTest-1430980438 tempest-ServerExternalEventsTest-1430980438-project-member] [instance: e2ee8d01-b1d3-4bde-81ae-668ffeef42b0] Start destroying the instance on the hypervisor. {{(pid=68906) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 730.623865] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-b7790bd5-6745-473c-89fa-de780d438374 tempest-ServerExternalEventsTest-1430980438 tempest-ServerExternalEventsTest-1430980438-project-member] [instance: e2ee8d01-b1d3-4bde-81ae-668ffeef42b0] Destroying instance {{(pid=68906) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 730.624106] env[68906]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-42905989-67dd-492f-aeba-242d4555ec34 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 730.626430] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0625e9a5-461c-42b6-bbbb-8fd4ede614e9 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 730.633305] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-b7790bd5-6745-473c-89fa-de780d438374 tempest-ServerExternalEventsTest-1430980438 tempest-ServerExternalEventsTest-1430980438-project-member] [instance: e2ee8d01-b1d3-4bde-81ae-668ffeef42b0] Unregistering the VM {{(pid=68906) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 730.633539] env[68906]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-1ed9119b-7a0e-4658-b3a8-6392f1076f44 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 730.635674] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-6e630c8f-2ec9-418e-9d9b-a4ca682ae7cd tempest-FloatingIPsAssociationTestJSON-1370684228 tempest-FloatingIPsAssociationTestJSON-1370684228-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=68906) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 730.635846] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-6e630c8f-2ec9-418e-9d9b-a4ca682ae7cd tempest-FloatingIPsAssociationTestJSON-1370684228 tempest-FloatingIPsAssociationTestJSON-1370684228-project-member] Folder [datastore2] devstack-image-cache_base created. 
{{(pid=68906) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 730.636792] env[68906]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-4b6fdda7-c679-46e1-973c-dfb924985f6a {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 730.641586] env[68906]: DEBUG oslo_vmware.api [None req-6e630c8f-2ec9-418e-9d9b-a4ca682ae7cd tempest-FloatingIPsAssociationTestJSON-1370684228 tempest-FloatingIPsAssociationTestJSON-1370684228-project-member] Waiting for the task: (returnval){ [ 730.641586] env[68906]: value = "session[52a3cb0d-1212-490c-2669-91043b4da4d8]52a985b7-e95c-8c14-3191-78bce9142b34" [ 730.641586] env[68906]: _type = "Task" [ 730.641586] env[68906]: } to complete. {{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 730.648703] env[68906]: DEBUG oslo_vmware.api [None req-6e630c8f-2ec9-418e-9d9b-a4ca682ae7cd tempest-FloatingIPsAssociationTestJSON-1370684228 tempest-FloatingIPsAssociationTestJSON-1370684228-project-member] Task: {'id': session[52a3cb0d-1212-490c-2669-91043b4da4d8]52a985b7-e95c-8c14-3191-78bce9142b34, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 730.703508] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-b7790bd5-6745-473c-89fa-de780d438374 tempest-ServerExternalEventsTest-1430980438 tempest-ServerExternalEventsTest-1430980438-project-member] [instance: e2ee8d01-b1d3-4bde-81ae-668ffeef42b0] Unregistered the VM {{(pid=68906) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 730.703731] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-b7790bd5-6745-473c-89fa-de780d438374 tempest-ServerExternalEventsTest-1430980438 tempest-ServerExternalEventsTest-1430980438-project-member] [instance: e2ee8d01-b1d3-4bde-81ae-668ffeef42b0] Deleting contents of the VM from datastore datastore2 {{(pid=68906) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 730.703913] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-b7790bd5-6745-473c-89fa-de780d438374 tempest-ServerExternalEventsTest-1430980438 tempest-ServerExternalEventsTest-1430980438-project-member] Deleting the datastore file [datastore2] e2ee8d01-b1d3-4bde-81ae-668ffeef42b0 {{(pid=68906) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 730.704199] env[68906]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-2388c11b-3eb3-414a-9ba3-916bf84d791a {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 730.710242] env[68906]: DEBUG oslo_vmware.api [None req-b7790bd5-6745-473c-89fa-de780d438374 tempest-ServerExternalEventsTest-1430980438 tempest-ServerExternalEventsTest-1430980438-project-member] Waiting for the task: (returnval){ [ 730.710242] env[68906]: value = "task-3475301" [ 730.710242] env[68906]: _type = "Task" [ 730.710242] env[68906]: } to complete. {{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 730.717935] env[68906]: DEBUG oslo_vmware.api [None req-b7790bd5-6745-473c-89fa-de780d438374 tempest-ServerExternalEventsTest-1430980438 tempest-ServerExternalEventsTest-1430980438-project-member] Task: {'id': task-3475301, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 731.155681] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-6e630c8f-2ec9-418e-9d9b-a4ca682ae7cd tempest-FloatingIPsAssociationTestJSON-1370684228 tempest-FloatingIPsAssociationTestJSON-1370684228-project-member] [instance: 46481a4e-ac53-456d-b6cb-9f3ffbccf407] Preparing fetch location {{(pid=68906) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 731.156111] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-6e630c8f-2ec9-418e-9d9b-a4ca682ae7cd tempest-FloatingIPsAssociationTestJSON-1370684228 tempest-FloatingIPsAssociationTestJSON-1370684228-project-member] Creating directory with path [datastore2] vmware_temp/a94d1b07-a42f-4e08-bf12-1e88f7b2a7b8/b1400c31-d33b-4e13-944f-4c645e62493e {{(pid=68906) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 731.156476] env[68906]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-159071d9-003c-44c0-9ca7-332e974656b2 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 731.169873] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-6e630c8f-2ec9-418e-9d9b-a4ca682ae7cd tempest-FloatingIPsAssociationTestJSON-1370684228 tempest-FloatingIPsAssociationTestJSON-1370684228-project-member] Created directory with path [datastore2] vmware_temp/a94d1b07-a42f-4e08-bf12-1e88f7b2a7b8/b1400c31-d33b-4e13-944f-4c645e62493e {{(pid=68906) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 731.170090] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-6e630c8f-2ec9-418e-9d9b-a4ca682ae7cd tempest-FloatingIPsAssociationTestJSON-1370684228 tempest-FloatingIPsAssociationTestJSON-1370684228-project-member] [instance: 46481a4e-ac53-456d-b6cb-9f3ffbccf407] Fetch image to [datastore2] vmware_temp/a94d1b07-a42f-4e08-bf12-1e88f7b2a7b8/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk {{(pid=68906) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 731.170270] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-6e630c8f-2ec9-418e-9d9b-a4ca682ae7cd tempest-FloatingIPsAssociationTestJSON-1370684228 tempest-FloatingIPsAssociationTestJSON-1370684228-project-member] [instance: 46481a4e-ac53-456d-b6cb-9f3ffbccf407] Downloading image file data b1400c31-d33b-4e13-944f-4c645e62493e to [datastore2] vmware_temp/a94d1b07-a42f-4e08-bf12-1e88f7b2a7b8/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk on the data store datastore2 {{(pid=68906) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 731.171068] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-40951da5-561b-498b-a2ad-1d571c7ae9a7 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 731.179923] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bf76ca8b-2e08-413b-8827-b334c6875111 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 731.191491] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-aa00f7b3-5c66-4f24-8917-378e376ef383 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 731.225300] env[68906]: DEBUG oslo_vmware.service [-] Invoking 
PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1e0945b8-da71-4314-9774-3b19b813ef2e {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 731.233743] env[68906]: DEBUG oslo_vmware.api [None req-b7790bd5-6745-473c-89fa-de780d438374 tempest-ServerExternalEventsTest-1430980438 tempest-ServerExternalEventsTest-1430980438-project-member] Task: {'id': task-3475301, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.075366} completed successfully. {{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 731.234290] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-b7790bd5-6745-473c-89fa-de780d438374 tempest-ServerExternalEventsTest-1430980438 tempest-ServerExternalEventsTest-1430980438-project-member] Deleted the datastore file {{(pid=68906) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 731.234480] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-b7790bd5-6745-473c-89fa-de780d438374 tempest-ServerExternalEventsTest-1430980438 tempest-ServerExternalEventsTest-1430980438-project-member] [instance: e2ee8d01-b1d3-4bde-81ae-668ffeef42b0] Deleted contents of the VM from datastore datastore2 {{(pid=68906) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 731.234653] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-b7790bd5-6745-473c-89fa-de780d438374 tempest-ServerExternalEventsTest-1430980438 tempest-ServerExternalEventsTest-1430980438-project-member] [instance: e2ee8d01-b1d3-4bde-81ae-668ffeef42b0] Instance destroyed {{(pid=68906) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 731.234847] env[68906]: INFO nova.compute.manager [None req-b7790bd5-6745-473c-89fa-de780d438374 tempest-ServerExternalEventsTest-1430980438 tempest-ServerExternalEventsTest-1430980438-project-member] [instance: e2ee8d01-b1d3-4bde-81ae-668ffeef42b0] Took 0.61 seconds to destroy the instance on the hypervisor. 
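The DeleteDatastoreFile_Task sequence just above (create the task, poll it until a terminal state, then read duration_secs) is the standard oslo.vmware pattern: every vSphere "*_Task" method returns a task managed-object reference, and wait_for_task() polls it (the "progress is 0%" DEBUG lines) until it succeeds or raises a translated fault. A minimal sketch of that pattern follows; the vCenter host and credentials are placeholders, not values from this log, and the datacenter argument may need a real Datacenter moref depending on the deployment.

    # Sketch of the task-create-then-poll pattern seen above; host and
    # credentials are placeholders, not taken from this environment.
    from oslo_vmware import api

    session = api.VMwareAPISession(
        'vc.example.test', 'user', 'secret',
        api_retry_count=3, task_poll_interval=0.5)

    # FileManager lives on the ServiceContent, matching the logged
    # "Invoking FileManager.DeleteDatastoreFile_Task" call.
    file_manager = session.vim.service_content.fileManager
    task = session.invoke_api(
        session.vim, 'DeleteDatastoreFile_Task', file_manager,
        name='[datastore2] e2ee8d01-b1d3-4bde-81ae-668ffeef42b0',
        datacenter=None)  # a Datacenter moref may be required here

    # wait_for_task() polls the task (the "progress is 0%" lines) and
    # raises a translated exception, e.g. VimFaultException, if the
    # task finishes in an error state.
    session.wait_for_task(task)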
[ 731.236387] env[68906]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-403e8c26-a1cc-41c7-bd56-8702debbc1ab {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 731.238404] env[68906]: DEBUG nova.compute.claims [None req-b7790bd5-6745-473c-89fa-de780d438374 tempest-ServerExternalEventsTest-1430980438 tempest-ServerExternalEventsTest-1430980438-project-member] [instance: e2ee8d01-b1d3-4bde-81ae-668ffeef42b0] Aborting claim: {{(pid=68906) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 731.238817] env[68906]: DEBUG oslo_concurrency.lockutils [None req-b7790bd5-6745-473c-89fa-de780d438374 tempest-ServerExternalEventsTest-1430980438 tempest-ServerExternalEventsTest-1430980438-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 731.238817] env[68906]: DEBUG oslo_concurrency.lockutils [None req-b7790bd5-6745-473c-89fa-de780d438374 tempest-ServerExternalEventsTest-1430980438 tempest-ServerExternalEventsTest-1430980438-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 731.268910] env[68906]: DEBUG nova.virt.vmwareapi.images [None req-6e630c8f-2ec9-418e-9d9b-a4ca682ae7cd tempest-FloatingIPsAssociationTestJSON-1370684228 tempest-FloatingIPsAssociationTestJSON-1370684228-project-member] [instance: 46481a4e-ac53-456d-b6cb-9f3ffbccf407] Downloading image file data b1400c31-d33b-4e13-944f-4c645e62493e to the data store datastore2 {{(pid=68906) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 731.321626] env[68906]: DEBUG oslo_vmware.rw_handles [None req-6e630c8f-2ec9-418e-9d9b-a4ca682ae7cd tempest-FloatingIPsAssociationTestJSON-1370684228 tempest-FloatingIPsAssociationTestJSON-1370684228-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/a94d1b07-a42f-4e08-bf12-1e88f7b2a7b8/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=68906) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 731.382719] env[68906]: DEBUG oslo_vmware.rw_handles [None req-6e630c8f-2ec9-418e-9d9b-a4ca682ae7cd tempest-FloatingIPsAssociationTestJSON-1370684228 tempest-FloatingIPsAssociationTestJSON-1370684228-project-member] Completed reading data from the image iterator. {{(pid=68906) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 731.382979] env[68906]: DEBUG oslo_vmware.rw_handles [None req-6e630c8f-2ec9-418e-9d9b-a4ca682ae7cd tempest-FloatingIPsAssociationTestJSON-1370684228 tempest-FloatingIPsAssociationTestJSON-1370684228-project-member] Closing write handle for https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/a94d1b07-a42f-4e08-bf12-1e88f7b2a7b8/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=68906) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 731.706627] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-319a9bb8-789f-452f-8ec5-04b8a16634ef {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 731.714104] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0748b92e-7451-4d6e-91ab-79027f0fc813 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 731.745533] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-67969652-c021-4761-add8-71602fc3a1f4 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 731.752746] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-eb10e3fe-8eca-49aa-844f-5f10ae1718bc {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 731.765688] env[68906]: DEBUG nova.compute.provider_tree [None req-b7790bd5-6745-473c-89fa-de780d438374 tempest-ServerExternalEventsTest-1430980438 tempest-ServerExternalEventsTest-1430980438-project-member] Inventory has not changed in ProviderTree for provider: 1119f6db-bfd7-4ef3-bdff-5c6974dc249b {{(pid=68906) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 731.774317] env[68906]: DEBUG nova.scheduler.client.report [None req-b7790bd5-6745-473c-89fa-de780d438374 tempest-ServerExternalEventsTest-1430980438 tempest-ServerExternalEventsTest-1430980438-project-member] Inventory has not changed for provider 1119f6db-bfd7-4ef3-bdff-5c6974dc249b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 93, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68906) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 731.786756] env[68906]: DEBUG oslo_concurrency.lockutils [None req-b7790bd5-6745-473c-89fa-de780d438374 tempest-ServerExternalEventsTest-1430980438 tempest-ServerExternalEventsTest-1430980438-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.548s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 731.787295] env[68906]: ERROR nova.compute.manager [None req-b7790bd5-6745-473c-89fa-de780d438374 tempest-ServerExternalEventsTest-1430980438 tempest-ServerExternalEventsTest-1430980438-project-member] [instance: e2ee8d01-b1d3-4bde-81ae-668ffeef42b0] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 731.787295] env[68906]: Faults: ['InvalidArgument'] [ 731.787295] env[68906]: ERROR nova.compute.manager [instance: e2ee8d01-b1d3-4bde-81ae-668ffeef42b0] Traceback (most recent call last): [ 731.787295] env[68906]: ERROR nova.compute.manager [instance: e2ee8d01-b1d3-4bde-81ae-668ffeef42b0] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 731.787295] env[68906]: 
ERROR nova.compute.manager [instance: e2ee8d01-b1d3-4bde-81ae-668ffeef42b0] self.driver.spawn(context, instance, image_meta, [ 731.787295] env[68906]: ERROR nova.compute.manager [instance: e2ee8d01-b1d3-4bde-81ae-668ffeef42b0] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 731.787295] env[68906]: ERROR nova.compute.manager [instance: e2ee8d01-b1d3-4bde-81ae-668ffeef42b0] self._vmops.spawn(context, instance, image_meta, injected_files, [ 731.787295] env[68906]: ERROR nova.compute.manager [instance: e2ee8d01-b1d3-4bde-81ae-668ffeef42b0] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 731.787295] env[68906]: ERROR nova.compute.manager [instance: e2ee8d01-b1d3-4bde-81ae-668ffeef42b0] self._fetch_image_if_missing(context, vi) [ 731.787295] env[68906]: ERROR nova.compute.manager [instance: e2ee8d01-b1d3-4bde-81ae-668ffeef42b0] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 731.787295] env[68906]: ERROR nova.compute.manager [instance: e2ee8d01-b1d3-4bde-81ae-668ffeef42b0] image_cache(vi, tmp_image_ds_loc) [ 731.787295] env[68906]: ERROR nova.compute.manager [instance: e2ee8d01-b1d3-4bde-81ae-668ffeef42b0] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 731.787631] env[68906]: ERROR nova.compute.manager [instance: e2ee8d01-b1d3-4bde-81ae-668ffeef42b0] vm_util.copy_virtual_disk( [ 731.787631] env[68906]: ERROR nova.compute.manager [instance: e2ee8d01-b1d3-4bde-81ae-668ffeef42b0] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 731.787631] env[68906]: ERROR nova.compute.manager [instance: e2ee8d01-b1d3-4bde-81ae-668ffeef42b0] session._wait_for_task(vmdk_copy_task) [ 731.787631] env[68906]: ERROR nova.compute.manager [instance: e2ee8d01-b1d3-4bde-81ae-668ffeef42b0] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 731.787631] env[68906]: ERROR nova.compute.manager [instance: e2ee8d01-b1d3-4bde-81ae-668ffeef42b0] return self.wait_for_task(task_ref) [ 731.787631] env[68906]: ERROR nova.compute.manager [instance: e2ee8d01-b1d3-4bde-81ae-668ffeef42b0] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 731.787631] env[68906]: ERROR nova.compute.manager [instance: e2ee8d01-b1d3-4bde-81ae-668ffeef42b0] return evt.wait() [ 731.787631] env[68906]: ERROR nova.compute.manager [instance: e2ee8d01-b1d3-4bde-81ae-668ffeef42b0] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 731.787631] env[68906]: ERROR nova.compute.manager [instance: e2ee8d01-b1d3-4bde-81ae-668ffeef42b0] result = hub.switch() [ 731.787631] env[68906]: ERROR nova.compute.manager [instance: e2ee8d01-b1d3-4bde-81ae-668ffeef42b0] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 731.787631] env[68906]: ERROR nova.compute.manager [instance: e2ee8d01-b1d3-4bde-81ae-668ffeef42b0] return self.greenlet.switch() [ 731.787631] env[68906]: ERROR nova.compute.manager [instance: e2ee8d01-b1d3-4bde-81ae-668ffeef42b0] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 731.787631] env[68906]: ERROR nova.compute.manager [instance: e2ee8d01-b1d3-4bde-81ae-668ffeef42b0] self.f(*self.args, **self.kw) [ 731.787947] env[68906]: ERROR nova.compute.manager [instance: e2ee8d01-b1d3-4bde-81ae-668ffeef42b0] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 731.787947] env[68906]: ERROR nova.compute.manager [instance: e2ee8d01-b1d3-4bde-81ae-668ffeef42b0] raise exceptions.translate_fault(task_info.error) [ 731.787947] env[68906]: ERROR nova.compute.manager [instance: e2ee8d01-b1d3-4bde-81ae-668ffeef42b0] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 731.787947] env[68906]: ERROR nova.compute.manager [instance: e2ee8d01-b1d3-4bde-81ae-668ffeef42b0] Faults: ['InvalidArgument'] [ 731.787947] env[68906]: ERROR nova.compute.manager [instance: e2ee8d01-b1d3-4bde-81ae-668ffeef42b0] [ 731.788096] env[68906]: DEBUG nova.compute.utils [None req-b7790bd5-6745-473c-89fa-de780d438374 tempest-ServerExternalEventsTest-1430980438 tempest-ServerExternalEventsTest-1430980438-project-member] [instance: e2ee8d01-b1d3-4bde-81ae-668ffeef42b0] VimFaultException {{(pid=68906) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 731.789611] env[68906]: DEBUG nova.compute.manager [None req-b7790bd5-6745-473c-89fa-de780d438374 tempest-ServerExternalEventsTest-1430980438 tempest-ServerExternalEventsTest-1430980438-project-member] [instance: e2ee8d01-b1d3-4bde-81ae-668ffeef42b0] Build of instance e2ee8d01-b1d3-4bde-81ae-668ffeef42b0 was re-scheduled: A specified parameter was not correct: fileType [ 731.789611] env[68906]: Faults: ['InvalidArgument'] {{(pid=68906) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 731.789978] env[68906]: DEBUG nova.compute.manager [None req-b7790bd5-6745-473c-89fa-de780d438374 tempest-ServerExternalEventsTest-1430980438 tempest-ServerExternalEventsTest-1430980438-project-member] [instance: e2ee8d01-b1d3-4bde-81ae-668ffeef42b0] Unplugging VIFs for instance {{(pid=68906) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 731.790159] env[68906]: DEBUG nova.compute.manager [None req-b7790bd5-6745-473c-89fa-de780d438374 tempest-ServerExternalEventsTest-1430980438 tempest-ServerExternalEventsTest-1430980438-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=68906) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 731.790310] env[68906]: DEBUG nova.compute.manager [None req-b7790bd5-6745-473c-89fa-de780d438374 tempest-ServerExternalEventsTest-1430980438 tempest-ServerExternalEventsTest-1430980438-project-member] [instance: e2ee8d01-b1d3-4bde-81ae-668ffeef42b0] Deallocating network for instance {{(pid=68906) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 731.790473] env[68906]: DEBUG nova.network.neutron [None req-b7790bd5-6745-473c-89fa-de780d438374 tempest-ServerExternalEventsTest-1430980438 tempest-ServerExternalEventsTest-1430980438-project-member] [instance: e2ee8d01-b1d3-4bde-81ae-668ffeef42b0] deallocate_for_instance() {{(pid=68906) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 732.298686] env[68906]: DEBUG nova.network.neutron [None req-b7790bd5-6745-473c-89fa-de780d438374 tempest-ServerExternalEventsTest-1430980438 tempest-ServerExternalEventsTest-1430980438-project-member] [instance: e2ee8d01-b1d3-4bde-81ae-668ffeef42b0] Updating instance_info_cache with network_info: [] {{(pid=68906) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 732.313286] env[68906]: INFO nova.compute.manager [None req-b7790bd5-6745-473c-89fa-de780d438374 tempest-ServerExternalEventsTest-1430980438 tempest-ServerExternalEventsTest-1430980438-project-member] [instance: e2ee8d01-b1d3-4bde-81ae-668ffeef42b0] Took 0.52 seconds to deallocate network for instance. [ 732.422294] env[68906]: INFO nova.scheduler.client.report [None req-b7790bd5-6745-473c-89fa-de780d438374 tempest-ServerExternalEventsTest-1430980438 tempest-ServerExternalEventsTest-1430980438-project-member] Deleted allocations for instance e2ee8d01-b1d3-4bde-81ae-668ffeef42b0 [ 732.444050] env[68906]: DEBUG oslo_concurrency.lockutils [None req-b7790bd5-6745-473c-89fa-de780d438374 tempest-ServerExternalEventsTest-1430980438 tempest-ServerExternalEventsTest-1430980438-project-member] Lock "e2ee8d01-b1d3-4bde-81ae-668ffeef42b0" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 111.586s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 732.456584] env[68906]: DEBUG nova.compute.manager [None req-f739b187-6eca-4474-a1be-8f642c15af34 tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] [instance: 13eebe4e-5984-46c3-bb73-cd783ad45df6] Starting instance... 
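Two locks frame this sequence: the per-instance build lock, named by the instance UUID and held 111.586s across the failed spawn, reschedule and network teardown, and the host-wide "compute_resources" lock the next claim immediately takes. Both come from oslo.concurrency; a small sketch of the per-name locking pattern (the workload here is a stand-in):

```python
import time

from oslo_concurrency import lockutils


def locked_do_build(instance_uuid, build):
    # One lock per instance UUID: builds of different instances run in
    # parallel, while two operations on the same instance serialize, as in
    # the 'Lock "e2ee8d01-..." released ... held 111.586s' entry above.
    @lockutils.synchronized(instance_uuid)
    def _locked():
        return build()
    return _locked()


start = time.monotonic()
locked_do_build("e2ee8d01-b1d3-4bde-81ae-668ffeef42b0", lambda: time.sleep(0.1))
print(f"held {time.monotonic() - start:.3f}s")
```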
{{(pid=68906) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 732.508567] env[68906]: DEBUG oslo_concurrency.lockutils [None req-f739b187-6eca-4474-a1be-8f642c15af34 tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 732.508965] env[68906]: DEBUG oslo_concurrency.lockutils [None req-f739b187-6eca-4474-a1be-8f642c15af34 tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 732.510719] env[68906]: INFO nova.compute.claims [None req-f739b187-6eca-4474-a1be-8f642c15af34 tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] [instance: 13eebe4e-5984-46c3-bb73-cd783ad45df6] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 732.928154] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-eea5f2cf-974c-4900-8197-ea47a74713f4 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 732.936192] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f23aec95-67df-4f33-9fdf-96dc8d99c8e2 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 732.966511] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-71e45759-4974-4ed3-842c-64f0115b8e7e {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 732.973954] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3c949864-d0b3-4f11-b548-7ebea60aeb01 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 732.986804] env[68906]: DEBUG nova.compute.provider_tree [None req-f739b187-6eca-4474-a1be-8f642c15af34 tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] Inventory has not changed in ProviderTree for provider: 1119f6db-bfd7-4ef3-bdff-5c6974dc249b {{(pid=68906) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 732.995790] env[68906]: DEBUG nova.scheduler.client.report [None req-f739b187-6eca-4474-a1be-8f642c15af34 tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] Inventory has not changed for provider 1119f6db-bfd7-4ef3-bdff-5c6974dc249b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 93, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68906) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 733.015060] env[68906]: DEBUG oslo_concurrency.lockutils [None 
req-f739b187-6eca-4474-a1be-8f642c15af34 tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.506s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 733.015599] env[68906]: DEBUG nova.compute.manager [None req-f739b187-6eca-4474-a1be-8f642c15af34 tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] [instance: 13eebe4e-5984-46c3-bb73-cd783ad45df6] Start building networks asynchronously for instance. {{(pid=68906) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 733.057585] env[68906]: DEBUG nova.compute.utils [None req-f739b187-6eca-4474-a1be-8f642c15af34 tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] Using /dev/sd instead of None {{(pid=68906) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 733.058893] env[68906]: DEBUG nova.compute.manager [None req-f739b187-6eca-4474-a1be-8f642c15af34 tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] [instance: 13eebe4e-5984-46c3-bb73-cd783ad45df6] Allocating IP information in the background. {{(pid=68906) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 733.059234] env[68906]: DEBUG nova.network.neutron [None req-f739b187-6eca-4474-a1be-8f642c15af34 tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] [instance: 13eebe4e-5984-46c3-bb73-cd783ad45df6] allocate_for_instance() {{(pid=68906) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 733.068030] env[68906]: DEBUG nova.compute.manager [None req-f739b187-6eca-4474-a1be-8f642c15af34 tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] [instance: 13eebe4e-5984-46c3-bb73-cd783ad45df6] Start building block device mappings for instance. {{(pid=68906) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 733.133879] env[68906]: DEBUG nova.compute.manager [None req-f739b187-6eca-4474-a1be-8f642c15af34 tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] [instance: 13eebe4e-5984-46c3-bb73-cd783ad45df6] Start spawning the instance on the hypervisor. 
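The inventory dict the report client keeps comparing is placement's capacity model: a resource class can be consumed up to `(total - reserved) * allocation_ratio`, while `max_unit` caps what any single allocation may request. Checking the numbers from the log (inventory trimmed to the relevant keys):

```python
def effective_capacity(inv):
    # Placement's usable pool for one resource class.
    return int((inv["total"] - inv["reserved"]) * inv["allocation_ratio"])


# Values from the provider 1119f6db-... inventory above.
inventory = {
    "VCPU": {"total": 48, "reserved": 0, "max_unit": 16,
             "allocation_ratio": 4.0},
    "MEMORY_MB": {"total": 196590, "reserved": 512, "max_unit": 65530,
                  "allocation_ratio": 1.0},
    "DISK_GB": {"total": 400, "reserved": 0, "max_unit": 93,
                "allocation_ratio": 1.0},
}

for rc, inv in inventory.items():
    print(rc, effective_capacity(inv), "max per allocation:", inv["max_unit"])
# -> VCPU 192 (48 vcpus at 4.0 overcommit), MEMORY_MB 196078, DISK_GB 400;
#    no single instance may claim more than 16 VCPU or 93 GB regardless.
```

An m1.nano claim of 1 vCPU / 128 MB / 1 GB fits trivially, which is why the claim above succeeds in half a second.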
{{(pid=68906) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 733.158459] env[68906]: DEBUG nova.virt.hardware [None req-f739b187-6eca-4474-a1be-8f642c15af34 tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-17T13:00:39Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-17T13:00:23Z,direct_url=,disk_format='vmdk',id=b1400c31-d33b-4e13-944f-4c645e62493e,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='1ae7bf3a375d41c6af5e7536af51ffd1',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-17T13:00:24Z,virtual_size=,visibility=), allow threads: False {{(pid=68906) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 733.158712] env[68906]: DEBUG nova.virt.hardware [None req-f739b187-6eca-4474-a1be-8f642c15af34 tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] Flavor limits 0:0:0 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 733.158871] env[68906]: DEBUG nova.virt.hardware [None req-f739b187-6eca-4474-a1be-8f642c15af34 tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] Image limits 0:0:0 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 733.159066] env[68906]: DEBUG nova.virt.hardware [None req-f739b187-6eca-4474-a1be-8f642c15af34 tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] Flavor pref 0:0:0 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 733.159218] env[68906]: DEBUG nova.virt.hardware [None req-f739b187-6eca-4474-a1be-8f642c15af34 tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] Image pref 0:0:0 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 733.159366] env[68906]: DEBUG nova.virt.hardware [None req-f739b187-6eca-4474-a1be-8f642c15af34 tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 733.159601] env[68906]: DEBUG nova.virt.hardware [None req-f739b187-6eca-4474-a1be-8f642c15af34 tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68906) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 733.159768] env[68906]: DEBUG nova.virt.hardware [None req-f739b187-6eca-4474-a1be-8f642c15af34 tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68906) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 733.159938] env[68906]: DEBUG nova.virt.hardware [None 
req-f739b187-6eca-4474-a1be-8f642c15af34 tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] Got 1 possible topologies {{(pid=68906) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 733.160116] env[68906]: DEBUG nova.virt.hardware [None req-f739b187-6eca-4474-a1be-8f642c15af34 tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68906) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 733.160295] env[68906]: DEBUG nova.virt.hardware [None req-f739b187-6eca-4474-a1be-8f642c15af34 tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68906) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 733.161175] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-157b0f52-a5bb-48a4-8192-661f16cfd8af {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 733.169119] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e59297a9-9152-4950-9fff-e4cf851f8598 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 733.314850] env[68906]: DEBUG nova.policy [None req-f739b187-6eca-4474-a1be-8f642c15af34 tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '7e8b8fc273be4fa49144f70d1b1b2a3a', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '3d3cc4c86bc14a69a001ef23df615f2c', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68906) authorize /opt/stack/nova/nova/policy.py:203}} [ 733.906273] env[68906]: DEBUG nova.network.neutron [None req-f739b187-6eca-4474-a1be-8f642c15af34 tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] [instance: 13eebe4e-5984-46c3-bb73-cd783ad45df6] Successfully created port: 0f1da10a-6b69-4953-819a-138ce9aa9a24 {{(pid=68906) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 735.006148] env[68906]: DEBUG nova.network.neutron [None req-f739b187-6eca-4474-a1be-8f642c15af34 tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] [instance: 13eebe4e-5984-46c3-bb73-cd783ad45df6] Successfully updated port: 0f1da10a-6b69-4953-819a-138ce9aa9a24 {{(pid=68906) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 735.028400] env[68906]: DEBUG oslo_concurrency.lockutils [None req-f739b187-6eca-4474-a1be-8f642c15af34 tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] Acquiring lock "refresh_cache-13eebe4e-5984-46c3-bb73-cd783ad45df6" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 735.028400] env[68906]: DEBUG oslo_concurrency.lockutils [None req-f739b187-6eca-4474-a1be-8f642c15af34 tempest-ServersAdminTestJSON-1863634977 
tempest-ServersAdminTestJSON-1863634977-project-member] Acquired lock "refresh_cache-13eebe4e-5984-46c3-bb73-cd783ad45df6" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 735.028400] env[68906]: DEBUG nova.network.neutron [None req-f739b187-6eca-4474-a1be-8f642c15af34 tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] [instance: 13eebe4e-5984-46c3-bb73-cd783ad45df6] Building network info cache for instance {{(pid=68906) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 735.113538] env[68906]: DEBUG nova.network.neutron [None req-f739b187-6eca-4474-a1be-8f642c15af34 tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] [instance: 13eebe4e-5984-46c3-bb73-cd783ad45df6] Instance cache missing network info. {{(pid=68906) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 735.180946] env[68906]: DEBUG nova.compute.manager [req-7a4b9e5b-60b3-44ac-a65d-271902b1db3d req-46ded392-be56-4adf-9db0-9dce284d8696 service nova] [instance: 13eebe4e-5984-46c3-bb73-cd783ad45df6] Received event network-vif-plugged-0f1da10a-6b69-4953-819a-138ce9aa9a24 {{(pid=68906) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 735.183488] env[68906]: DEBUG oslo_concurrency.lockutils [req-7a4b9e5b-60b3-44ac-a65d-271902b1db3d req-46ded392-be56-4adf-9db0-9dce284d8696 service nova] Acquiring lock "13eebe4e-5984-46c3-bb73-cd783ad45df6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 735.183718] env[68906]: DEBUG oslo_concurrency.lockutils [req-7a4b9e5b-60b3-44ac-a65d-271902b1db3d req-46ded392-be56-4adf-9db0-9dce284d8696 service nova] Lock "13eebe4e-5984-46c3-bb73-cd783ad45df6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.003s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 735.183892] env[68906]: DEBUG oslo_concurrency.lockutils [req-7a4b9e5b-60b3-44ac-a65d-271902b1db3d req-46ded392-be56-4adf-9db0-9dce284d8696 service nova] Lock "13eebe4e-5984-46c3-bb73-cd783ad45df6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 735.185145] env[68906]: DEBUG nova.compute.manager [req-7a4b9e5b-60b3-44ac-a65d-271902b1db3d req-46ded392-be56-4adf-9db0-9dce284d8696 service nova] [instance: 13eebe4e-5984-46c3-bb73-cd783ad45df6] No waiting events found dispatching network-vif-plugged-0f1da10a-6b69-4953-819a-138ce9aa9a24 {{(pid=68906) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 735.185361] env[68906]: WARNING nova.compute.manager [req-7a4b9e5b-60b3-44ac-a65d-271902b1db3d req-46ded392-be56-4adf-9db0-9dce284d8696 service nova] [instance: 13eebe4e-5984-46c3-bb73-cd783ad45df6] Received unexpected event network-vif-plugged-0f1da10a-6b69-4953-819a-138ce9aa9a24 for instance with vm_state building and task_state spawning. 
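The "No waiting events found dispatching network-vif-plugged-..." / "Received unexpected event" pair shows the external-event handshake: Neutron's notification is only consumed if the compute side registered a waiter for that (instance, event) pair first; otherwise it is logged and dropped. An illustrative stand-in for that registry (hypothetical names, not Nova's actual classes):

```python
# Hypothetical sketch of the waiter registry behind pop_instance_event.
import threading
from collections import defaultdict


class InstanceEvents:
    def __init__(self):
        self._lock = threading.Lock()
        self._waiters = defaultdict(dict)  # instance_uuid -> {event: Event}

    def prepare(self, instance_uuid, event_name):
        """Called before triggering work that will emit the event."""
        with self._lock:
            ev = threading.Event()
            self._waiters[instance_uuid][event_name] = ev
            return ev

    def pop(self, instance_uuid, event_name):
        with self._lock:
            return self._waiters.get(instance_uuid, {}).pop(event_name, None)


registry = InstanceEvents()


def external_instance_event(instance_uuid, event_name):
    waiter = registry.pop(instance_uuid, event_name)
    if waiter is None:
        print(f"Received unexpected event {event_name} for {instance_uuid}")
    else:
        waiter.set()


# No prepare() happened first, so the event is unexpected -- as in the log,
# where the VIF plug raced ahead of the spawn path that would wait on it.
external_instance_event(
    "13eebe4e-5984-46c3-bb73-cd783ad45df6",
    "network-vif-plugged-0f1da10a-6b69-4953-819a-138ce9aa9a24")
```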
[ 735.582958] env[68906]: DEBUG nova.network.neutron [None req-f739b187-6eca-4474-a1be-8f642c15af34 tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] [instance: 13eebe4e-5984-46c3-bb73-cd783ad45df6] Updating instance_info_cache with network_info: [{"id": "0f1da10a-6b69-4953-819a-138ce9aa9a24", "address": "fa:16:3e:a7:b0:77", "network": {"id": "a9fd09ac-36e9-4c8d-83bd-4e2704c839d6", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1118120170-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "3d3cc4c86bc14a69a001ef23df615f2c", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "40c947c4-f471-4d48-8e43-fee54198107e", "external-id": "nsx-vlan-transportzone-203", "segmentation_id": 203, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap0f1da10a-6b", "ovs_interfaceid": "0f1da10a-6b69-4953-819a-138ce9aa9a24", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68906) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 735.598435] env[68906]: DEBUG oslo_concurrency.lockutils [None req-f739b187-6eca-4474-a1be-8f642c15af34 tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] Releasing lock "refresh_cache-13eebe4e-5984-46c3-bb73-cd783ad45df6" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 735.598731] env[68906]: DEBUG nova.compute.manager [None req-f739b187-6eca-4474-a1be-8f642c15af34 tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] [instance: 13eebe4e-5984-46c3-bb73-cd783ad45df6] Instance network_info: |[{"id": "0f1da10a-6b69-4953-819a-138ce9aa9a24", "address": "fa:16:3e:a7:b0:77", "network": {"id": "a9fd09ac-36e9-4c8d-83bd-4e2704c839d6", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1118120170-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "3d3cc4c86bc14a69a001ef23df615f2c", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "40c947c4-f471-4d48-8e43-fee54198107e", "external-id": "nsx-vlan-transportzone-203", "segmentation_id": 203, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap0f1da10a-6b", "ovs_interfaceid": "0f1da10a-6b69-4953-819a-138ce9aa9a24", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=68906) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 735.599126] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None 
req-f739b187-6eca-4474-a1be-8f642c15af34 tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] [instance: 13eebe4e-5984-46c3-bb73-cd783ad45df6] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:a7:b0:77', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '40c947c4-f471-4d48-8e43-fee54198107e', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '0f1da10a-6b69-4953-819a-138ce9aa9a24', 'vif_model': 'vmxnet3'}] {{(pid=68906) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 735.606676] env[68906]: DEBUG oslo.service.loopingcall [None req-f739b187-6eca-4474-a1be-8f642c15af34 tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=68906) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 735.607252] env[68906]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 13eebe4e-5984-46c3-bb73-cd783ad45df6] Creating VM on the ESX host {{(pid=68906) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 735.607384] env[68906]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-9274f365-d28e-47b8-85ec-aacd4fcd5d56 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 735.628095] env[68906]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 735.628095] env[68906]: value = "task-3475302" [ 735.628095] env[68906]: _type = "Task" [ 735.628095] env[68906]: } to complete. {{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 735.636698] env[68906]: DEBUG oslo_vmware.api [-] Task: {'id': task-3475302, 'name': CreateVM_Task} progress is 0%. {{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 736.138696] env[68906]: DEBUG oslo_vmware.api [-] Task: {'id': task-3475302, 'name': CreateVM_Task, 'duration_secs': 0.294885} completed successfully. 
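"Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return" comes from oslo.service's looping-call machinery: the task poller runs on a fixed interval until it raises `LoopingCallDone`, whose return value is handed back to the waiter. A minimal usage sketch, assuming oslo.service is installed; the task states here are simulated rather than fetched from vCenter:

```python
import itertools

from oslo_service import loopingcall

# Simulated CreateVM_Task states; a real poller would query the vCenter task.
states = itertools.chain(["running", "running"], itertools.repeat("success"))


def _poll_task():
    state = next(states)
    print(f"Task: task-3475302 state={state}")
    if state == "success":
        # Stops the loop; the value becomes event.wait()'s result below.
        raise loopingcall.LoopingCallDone(retvalue="task-3475302")


timer = loopingcall.FixedIntervalLoopingCall(_poll_task)
event = timer.start(interval=0.1)
print("Completed:", event.wait())
```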
{{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 736.138965] env[68906]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 13eebe4e-5984-46c3-bb73-cd783ad45df6] Created VM on the ESX host {{(pid=68906) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 736.139608] env[68906]: DEBUG oslo_concurrency.lockutils [None req-f739b187-6eca-4474-a1be-8f642c15af34 tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 736.139731] env[68906]: DEBUG oslo_concurrency.lockutils [None req-f739b187-6eca-4474-a1be-8f642c15af34 tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] Acquired lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 736.140077] env[68906]: DEBUG oslo_concurrency.lockutils [None req-f739b187-6eca-4474-a1be-8f642c15af34 tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 736.140311] env[68906]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-2681dc60-5093-4e4c-960b-0b714e52449d {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 736.144982] env[68906]: DEBUG oslo_vmware.api [None req-f739b187-6eca-4474-a1be-8f642c15af34 tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] Waiting for the task: (returnval){ [ 736.144982] env[68906]: value = "session[52a3cb0d-1212-490c-2669-91043b4da4d8]52361477-2475-efcb-14ea-dec2c3c3259c" [ 736.144982] env[68906]: _type = "Task" [ 736.144982] env[68906]: } to complete. {{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 736.152365] env[68906]: DEBUG oslo_vmware.api [None req-f739b187-6eca-4474-a1be-8f642c15af34 tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] Task: {'id': session[52a3cb0d-1212-490c-2669-91043b4da4d8]52361477-2475-efcb-14ea-dec2c3c3259c, 'name': SearchDatastore_Task} progress is 0%. 
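The three lock lines above implement single-flight image caching: the datastore path of the cached image is itself the lock name, so exactly one worker fetches and converts a given image while concurrent spawns from the same image wait and then find the cache already populated. A sketch of that pattern with oslo.concurrency's context-manager form (the cache dict and fetch step are stand-ins):

```python
from oslo_concurrency import lockutils

# Lock name taken verbatim from the log: the cached image's datastore path.
IMAGE_LOCK = ("[datastore2] devstack-image-cache_base/"
              "b1400c31-d33b-4e13-944f-4c645e62493e")

_cache = {}


def ensure_image_cached(image_id, fetch):
    with lockutils.lock(IMAGE_LOCK):
        # Losers of the race arrive here after the winner populated the
        # cache, so the expensive fetch/convert happens at most once.
        if image_id not in _cache:
            _cache[image_id] = fetch()
        return _cache[image_id]


ensure_image_cached("b1400c31-d33b-4e13-944f-4c645e62493e",
                    lambda: "vmdk bytes")
```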
{{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 736.655231] env[68906]: DEBUG oslo_concurrency.lockutils [None req-f739b187-6eca-4474-a1be-8f642c15af34 tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] Releasing lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 736.655579] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-f739b187-6eca-4474-a1be-8f642c15af34 tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] [instance: 13eebe4e-5984-46c3-bb73-cd783ad45df6] Processing image b1400c31-d33b-4e13-944f-4c645e62493e {{(pid=68906) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 736.655691] env[68906]: DEBUG oslo_concurrency.lockutils [None req-f739b187-6eca-4474-a1be-8f642c15af34 tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e/b1400c31-d33b-4e13-944f-4c645e62493e.vmdk" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 737.248466] env[68906]: DEBUG nova.compute.manager [req-d9dee730-c63b-467f-9149-f6fe896e5f01 req-b1a13559-7589-48b1-9cc6-3f3d1c1c3313 service nova] [instance: 13eebe4e-5984-46c3-bb73-cd783ad45df6] Received event network-changed-0f1da10a-6b69-4953-819a-138ce9aa9a24 {{(pid=68906) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 737.248725] env[68906]: DEBUG nova.compute.manager [req-d9dee730-c63b-467f-9149-f6fe896e5f01 req-b1a13559-7589-48b1-9cc6-3f3d1c1c3313 service nova] [instance: 13eebe4e-5984-46c3-bb73-cd783ad45df6] Refreshing instance network info cache due to event network-changed-0f1da10a-6b69-4953-819a-138ce9aa9a24. {{(pid=68906) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 737.248877] env[68906]: DEBUG oslo_concurrency.lockutils [req-d9dee730-c63b-467f-9149-f6fe896e5f01 req-b1a13559-7589-48b1-9cc6-3f3d1c1c3313 service nova] Acquiring lock "refresh_cache-13eebe4e-5984-46c3-bb73-cd783ad45df6" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 737.249033] env[68906]: DEBUG oslo_concurrency.lockutils [req-d9dee730-c63b-467f-9149-f6fe896e5f01 req-b1a13559-7589-48b1-9cc6-3f3d1c1c3313 service nova] Acquired lock "refresh_cache-13eebe4e-5984-46c3-bb73-cd783ad45df6" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 737.249207] env[68906]: DEBUG nova.network.neutron [req-d9dee730-c63b-467f-9149-f6fe896e5f01 req-b1a13559-7589-48b1-9cc6-3f3d1c1c3313 service nova] [instance: 13eebe4e-5984-46c3-bb73-cd783ad45df6] Refreshing network info cache for port 0f1da10a-6b69-4953-819a-138ce9aa9a24 {{(pid=68906) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 737.718391] env[68906]: DEBUG nova.network.neutron [req-d9dee730-c63b-467f-9149-f6fe896e5f01 req-b1a13559-7589-48b1-9cc6-3f3d1c1c3313 service nova] [instance: 13eebe4e-5984-46c3-bb73-cd783ad45df6] Updated VIF entry in instance network info cache for port 0f1da10a-6b69-4953-819a-138ce9aa9a24. 
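The instance_info_cache payload being refreshed here (printed in full twice above) is stored as plain JSON. Below is a trimmed copy of that entry and a minimal extraction of the fields the VMware VIF plumbing actually consumes: the MAC address, the fixed IPs, and the NSX logical-switch id used as the OpaqueNetwork reference.

```python
import json

# Trimmed from the network_info entry in the log above.
network_info = json.loads("""
[{"id": "0f1da10a-6b69-4953-819a-138ce9aa9a24",
  "address": "fa:16:3e:a7:b0:77",
  "network": {"id": "a9fd09ac-36e9-4c8d-83bd-4e2704c839d6",
              "bridge": "br-int",
              "subnets": [{"cidr": "192.168.128.0/28",
                           "ips": [{"address": "192.168.128.7",
                                    "type": "fixed", "version": 4}]}]},
  "type": "ovs",
  "details": {"nsx-logical-switch-id":
              "40c947c4-f471-4d48-8e43-fee54198107e"}}]
""")

for vif in network_info:
    fixed_ips = [ip["address"]
                 for subnet in vif["network"]["subnets"]
                 for ip in subnet["ips"] if ip["type"] == "fixed"]
    print(vif["address"], fixed_ips, vif["details"]["nsx-logical-switch-id"])
# -> fa:16:3e:a7:b0:77 ['192.168.128.7'] 40c947c4-f471-4d48-8e43-fee54198107e
```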
{{(pid=68906) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 737.718727] env[68906]: DEBUG nova.network.neutron [req-d9dee730-c63b-467f-9149-f6fe896e5f01 req-b1a13559-7589-48b1-9cc6-3f3d1c1c3313 service nova] [instance: 13eebe4e-5984-46c3-bb73-cd783ad45df6] Updating instance_info_cache with network_info: [{"id": "0f1da10a-6b69-4953-819a-138ce9aa9a24", "address": "fa:16:3e:a7:b0:77", "network": {"id": "a9fd09ac-36e9-4c8d-83bd-4e2704c839d6", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-1118120170-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "3d3cc4c86bc14a69a001ef23df615f2c", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "40c947c4-f471-4d48-8e43-fee54198107e", "external-id": "nsx-vlan-transportzone-203", "segmentation_id": 203, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap0f1da10a-6b", "ovs_interfaceid": "0f1da10a-6b69-4953-819a-138ce9aa9a24", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68906) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 737.728092] env[68906]: DEBUG oslo_concurrency.lockutils [req-d9dee730-c63b-467f-9149-f6fe896e5f01 req-b1a13559-7589-48b1-9cc6-3f3d1c1c3313 service nova] Releasing lock "refresh_cache-13eebe4e-5984-46c3-bb73-cd783ad45df6" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 742.351585] env[68906]: DEBUG oslo_concurrency.lockutils [None req-86e3143e-5477-482e-8022-d48225e10873 tempest-ListImageFiltersTestJSON-730025420 tempest-ListImageFiltersTestJSON-730025420-project-member] Acquiring lock "acc11633-a489-4d8f-ad76-f17049a91545" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 742.351912] env[68906]: DEBUG oslo_concurrency.lockutils [None req-86e3143e-5477-482e-8022-d48225e10873 tempest-ListImageFiltersTestJSON-730025420 tempest-ListImageFiltersTestJSON-730025420-project-member] Lock "acc11633-a489-4d8f-ad76-f17049a91545" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 742.705704] env[68906]: DEBUG oslo_concurrency.lockutils [None req-ae8fe18e-f486-42a3-9628-3d30cfec0923 tempest-ListImageFiltersTestJSON-730025420 tempest-ListImageFiltersTestJSON-730025420-project-member] Acquiring lock "0874bf05-e156-404e-a067-869e370fd14b" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 742.705944] env[68906]: DEBUG oslo_concurrency.lockutils [None req-ae8fe18e-f486-42a3-9628-3d30cfec0923 tempest-ListImageFiltersTestJSON-730025420 
tempest-ListImageFiltersTestJSON-730025420-project-member] Lock "0874bf05-e156-404e-a067-869e370fd14b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 754.260067] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 754.260384] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 754.260486] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Starting heal instance info cache {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 754.260605] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Rebuilding the list of instances to heal {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 754.282790] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: 46481a4e-ac53-456d-b6cb-9f3ffbccf407] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 754.283031] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: d2258ded-478a-4530-b940-386286702048] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 754.283195] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: da0c4340-a657-43bd-9a98-4c8f50add720] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 754.283328] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: 0540a4dc-1b86-4776-b633-f540af168a2b] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 754.283453] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: 4edb8b9f-b608-4be8-bfd3-65642710f9bd] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 754.283579] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: d6ca51b9-b284-405c-878e-fdbc326b73e1] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 754.283702] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: f42056e5-52cb-4d69-8022-ca643c49194e] Skipping network cache update for instance because it is Building. 
{{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 754.283823] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: ce63789a-1f0f-40ca-8368-ac3f84bb58cd] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 754.283942] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: 9a2d2803-34b1-40f7-9349-e5734a217e18] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 754.284119] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: 13eebe4e-5984-46c3-bb73-cd783ad45df6] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 754.284195] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Didn't find any instances for network info cache update. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 754.284665] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 754.284835] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 755.140256] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 755.140497] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 755.140645] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] CONF.reclaim_instance_interval <= 0, skipping... 
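The run of "Running periodic task ComputeManager._*" entries is oslo.service's periodic-task dispatcher: methods decorated with `@periodic_task.periodic_task` are collected on the manager class and fired by `run_periodic_tasks()` when their spacing elapses. A minimal sketch, assuming oslo.service and oslo.config are installed; the spacings and the reclaim guard are illustrative, not Nova's real configuration:

```python
from oslo_config import cfg
from oslo_service import periodic_task

CONF = cfg.CONF


class Manager(periodic_task.PeriodicTasks):

    @periodic_task.periodic_task(spacing=60, run_immediately=True)
    def _heal_instance_info_cache(self, context):
        print("Starting heal instance info cache")

    @periodic_task.periodic_task(spacing=60, run_immediately=True)
    def _reclaim_queued_deletes(self, context):
        reclaim_interval = 0  # stand-in for CONF.reclaim_instance_interval
        if reclaim_interval <= 0:
            print("CONF.reclaim_instance_interval <= 0, skipping...")
            return


mgr = Manager(CONF)
mgr.run_periodic_tasks(context=None)  # a service calls this on a timer
```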
{{(pid=68906) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 756.140525] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 756.140800] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 756.140800] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager.update_available_resource {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 756.153568] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 756.153568] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 756.153708] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 756.153843] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=68906) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 756.155524] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-235c74e8-0d1e-4801-8273-826d095e5c96 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 756.164223] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9460cc88-8992-4004-a969-4e7124ed5254 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 756.178377] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d8a57818-4e00-430b-9743-acc656b1375a {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 756.184478] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bfda473b-5cd9-4a6a-bc92-dbcb87088a80 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 756.213549] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] 
Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180933MB free_disk=93GB free_vcpus=48 pci_devices=None {{(pid=68906) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 756.213653] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 756.213858] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 756.292156] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 46481a4e-ac53-456d-b6cb-9f3ffbccf407 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 756.292327] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance d2258ded-478a-4530-b940-386286702048 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 756.292459] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance da0c4340-a657-43bd-9a98-4c8f50add720 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 756.292589] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 0540a4dc-1b86-4776-b633-f540af168a2b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 756.292702] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 4edb8b9f-b608-4be8-bfd3-65642710f9bd actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 756.292820] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance d6ca51b9-b284-405c-878e-fdbc326b73e1 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
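Every record in this run (and the "Skipping heal" entries that follow) carries the same per-instance allocation, and the tracker's summary further below is simply their sum plus the reserved host memory. Reproducing the arithmetic with this log's values, ten tracked instances at 1 vCPU / 128 MB / 1 GB each:

```python
instances = [{"VCPU": 1, "MEMORY_MB": 128, "DISK_GB": 1}] * 10
reserved_host_memory_mb = 512   # the 'reserved' MEMORY_MB in the inventory

used_vcpus = sum(i["VCPU"] for i in instances)
used_ram = reserved_host_memory_mb + sum(i["MEMORY_MB"] for i in instances)
used_disk = sum(i["DISK_GB"] for i in instances)

print(f"used_ram={used_ram}MB used_disk={used_disk}GB used_vcpus={used_vcpus}")
# -> used_ram=1792MB used_disk=10GB used_vcpus=10, matching the
#    "Final resource view" the tracker reports below.
```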
{{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 756.292936] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance f42056e5-52cb-4d69-8022-ca643c49194e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 756.293063] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance ce63789a-1f0f-40ca-8368-ac3f84bb58cd actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 756.293215] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 13eebe4e-5984-46c3-bb73-cd783ad45df6 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 756.293284] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 9a2d2803-34b1-40f7-9349-e5734a217e18 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 756.304372] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance a7e0a28f-42a5-442e-b962-07771d2e6a27 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 756.315428] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance eb81e9b1-b573-4d7c-9ede-f8b32a43a201 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 756.325602] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 12724be5-cfb1-4cf6-b98b-b4142da21714 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 756.335461] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance b3bd0ecb-f329-48f3-b48b-25751262a5eb has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. 
Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 756.345167] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance e63fba5c-46fd-494c-9aec-dd76f12974d7 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 756.355162] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance c02f41e2-8a99-4f18-9d86-82fa702bb2b6 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 756.364232] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance a2414623-7871-4706-81db-7d15ca74fdab has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 756.373711] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 252028a3-3d3e-44c5-9c51-26752962a90d has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 756.382876] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 5ebd4d05-ddb3-4001-a526-a0c96b081818 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 756.392019] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance c2e2265b-aef3-4a8c-ae03-314e679af64b has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 756.401298] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 3ce9d4bd-3d7a-4191-9d3e-892efc81e8a2 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. 
Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 756.411507] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 3f26342e-89a8-4218-8875-8411eb8b16a0 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 756.421829] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 1fb9796e-e0d4-410d-bff1-a6a44b2a3580 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 756.430849] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance b77ff68e-350b-4f65-bb62-dfb727281e50 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 756.440323] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 29e5aa99-4e20-4b6f-a749-544b8c41a713 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 756.450025] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 20fa65c1-9ea0-4dc2-828e-8477c9f45baa has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 756.459078] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 98844da1-0e2a-46b5-8e72-c0f8dcd29b27 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 756.468996] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 91d405f9-5ad1-4e8e-9a32-1a5ef81ecc63 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. 
Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 756.478145] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 653c016d-c596-4f45-a18e-55f2d1935166 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 756.488088] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 627c0227-72ca-4a77-aca1-bc3112955e7a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 756.498146] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 03e8dff3-b6b8-4754-8725-dddc9f9e6216 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 756.507368] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance d3c5fdf4-a775-4b88-9bc2-ce9f31a9e6ac has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 756.516856] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 242433e2-5b59-4b19-ba8d-80432ee4b7b7 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 756.526747] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance acc11633-a489-4d8f-ad76-f17049a91545 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 756.536450] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 0874bf05-e156-404e-a067-869e370fd14b has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. 
Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 756.536724] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=68906) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 756.536876] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=68906) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 756.913145] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-764d87f3-616f-401c-ab7f-ad4a98f2cbb5 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 756.920449] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-da04d670-ee6b-4fb6-905f-abfc592f5ec8 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 756.950492] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4ab19eae-fedc-4576-8b3f-04b98fc0ff87 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 756.957351] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-323559fa-cb1f-4d09-9236-ca9e20ef7bb3 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 756.970217] env[68906]: DEBUG nova.compute.provider_tree [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Inventory has not changed in ProviderTree for provider: 1119f6db-bfd7-4ef3-bdff-5c6974dc249b {{(pid=68906) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 756.980322] env[68906]: DEBUG nova.scheduler.client.report [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Inventory has not changed for provider 1119f6db-bfd7-4ef3-bdff-5c6974dc249b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 93, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68906) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 756.994528] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=68906) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 756.994718] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.781s {{(pid=68906) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 776.766802] env[68906]: WARNING oslo_vmware.rw_handles [None req-6e630c8f-2ec9-418e-9d9b-a4ca682ae7cd tempest-FloatingIPsAssociationTestJSON-1370684228 tempest-FloatingIPsAssociationTestJSON-1370684228-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 776.766802] env[68906]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 776.766802] env[68906]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 776.766802] env[68906]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 776.766802] env[68906]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 776.766802] env[68906]: ERROR oslo_vmware.rw_handles response.begin() [ 776.766802] env[68906]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 776.766802] env[68906]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 776.766802] env[68906]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 776.766802] env[68906]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 776.766802] env[68906]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 776.766802] env[68906]: ERROR oslo_vmware.rw_handles [ 776.767415] env[68906]: DEBUG nova.virt.vmwareapi.images [None req-6e630c8f-2ec9-418e-9d9b-a4ca682ae7cd tempest-FloatingIPsAssociationTestJSON-1370684228 tempest-FloatingIPsAssociationTestJSON-1370684228-project-member] [instance: 46481a4e-ac53-456d-b6cb-9f3ffbccf407] Downloaded image file data b1400c31-d33b-4e13-944f-4c645e62493e to vmware_temp/a94d1b07-a42f-4e08-bf12-1e88f7b2a7b8/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk on the data store datastore2 {{(pid=68906) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 776.768851] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-6e630c8f-2ec9-418e-9d9b-a4ca682ae7cd tempest-FloatingIPsAssociationTestJSON-1370684228 tempest-FloatingIPsAssociationTestJSON-1370684228-project-member] [instance: 46481a4e-ac53-456d-b6cb-9f3ffbccf407] Caching image {{(pid=68906) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 776.769143] env[68906]: DEBUG nova.virt.vmwareapi.vm_util [None req-6e630c8f-2ec9-418e-9d9b-a4ca682ae7cd tempest-FloatingIPsAssociationTestJSON-1370684228 tempest-FloatingIPsAssociationTestJSON-1370684228-project-member] Copying Virtual Disk [datastore2] vmware_temp/a94d1b07-a42f-4e08-bf12-1e88f7b2a7b8/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk to [datastore2] vmware_temp/a94d1b07-a42f-4e08-bf12-1e88f7b2a7b8/b1400c31-d33b-4e13-944f-4c645e62493e/b1400c31-d33b-4e13-944f-4c645e62493e.vmdk {{(pid=68906) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 776.769426] env[68906]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-1cc36ce0-55a6-441a-8029-abd1017755ab {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 776.778068] env[68906]: DEBUG oslo_vmware.api [None req-6e630c8f-2ec9-418e-9d9b-a4ca682ae7cd tempest-FloatingIPsAssociationTestJSON-1370684228 
tempest-FloatingIPsAssociationTestJSON-1370684228-project-member] Waiting for the task: (returnval){ [ 776.778068] env[68906]: value = "task-3475303" [ 776.778068] env[68906]: _type = "Task" [ 776.778068] env[68906]: } to complete. {{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 776.785731] env[68906]: DEBUG oslo_vmware.api [None req-6e630c8f-2ec9-418e-9d9b-a4ca682ae7cd tempest-FloatingIPsAssociationTestJSON-1370684228 tempest-FloatingIPsAssociationTestJSON-1370684228-project-member] Task: {'id': task-3475303, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 777.289329] env[68906]: DEBUG oslo_vmware.exceptions [None req-6e630c8f-2ec9-418e-9d9b-a4ca682ae7cd tempest-FloatingIPsAssociationTestJSON-1370684228 tempest-FloatingIPsAssociationTestJSON-1370684228-project-member] Fault InvalidArgument not matched. {{(pid=68906) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 777.289619] env[68906]: DEBUG oslo_concurrency.lockutils [None req-6e630c8f-2ec9-418e-9d9b-a4ca682ae7cd tempest-FloatingIPsAssociationTestJSON-1370684228 tempest-FloatingIPsAssociationTestJSON-1370684228-project-member] Releasing lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e/b1400c31-d33b-4e13-944f-4c645e62493e.vmdk" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 777.290212] env[68906]: ERROR nova.compute.manager [None req-6e630c8f-2ec9-418e-9d9b-a4ca682ae7cd tempest-FloatingIPsAssociationTestJSON-1370684228 tempest-FloatingIPsAssociationTestJSON-1370684228-project-member] [instance: 46481a4e-ac53-456d-b6cb-9f3ffbccf407] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 777.290212] env[68906]: Faults: ['InvalidArgument'] [ 777.290212] env[68906]: ERROR nova.compute.manager [instance: 46481a4e-ac53-456d-b6cb-9f3ffbccf407] Traceback (most recent call last): [ 777.290212] env[68906]: ERROR nova.compute.manager [instance: 46481a4e-ac53-456d-b6cb-9f3ffbccf407] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 777.290212] env[68906]: ERROR nova.compute.manager [instance: 46481a4e-ac53-456d-b6cb-9f3ffbccf407] yield resources [ 777.290212] env[68906]: ERROR nova.compute.manager [instance: 46481a4e-ac53-456d-b6cb-9f3ffbccf407] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 777.290212] env[68906]: ERROR nova.compute.manager [instance: 46481a4e-ac53-456d-b6cb-9f3ffbccf407] self.driver.spawn(context, instance, image_meta, [ 777.290212] env[68906]: ERROR nova.compute.manager [instance: 46481a4e-ac53-456d-b6cb-9f3ffbccf407] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 777.290212] env[68906]: ERROR nova.compute.manager [instance: 46481a4e-ac53-456d-b6cb-9f3ffbccf407] self._vmops.spawn(context, instance, image_meta, injected_files, [ 777.290212] env[68906]: ERROR nova.compute.manager [instance: 46481a4e-ac53-456d-b6cb-9f3ffbccf407] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 777.290212] env[68906]: ERROR nova.compute.manager [instance: 46481a4e-ac53-456d-b6cb-9f3ffbccf407] self._fetch_image_if_missing(context, vi) [ 777.290212] env[68906]: ERROR nova.compute.manager [instance: 46481a4e-ac53-456d-b6cb-9f3ffbccf407] File 
"/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 777.290565] env[68906]: ERROR nova.compute.manager [instance: 46481a4e-ac53-456d-b6cb-9f3ffbccf407] image_cache(vi, tmp_image_ds_loc) [ 777.290565] env[68906]: ERROR nova.compute.manager [instance: 46481a4e-ac53-456d-b6cb-9f3ffbccf407] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 777.290565] env[68906]: ERROR nova.compute.manager [instance: 46481a4e-ac53-456d-b6cb-9f3ffbccf407] vm_util.copy_virtual_disk( [ 777.290565] env[68906]: ERROR nova.compute.manager [instance: 46481a4e-ac53-456d-b6cb-9f3ffbccf407] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 777.290565] env[68906]: ERROR nova.compute.manager [instance: 46481a4e-ac53-456d-b6cb-9f3ffbccf407] session._wait_for_task(vmdk_copy_task) [ 777.290565] env[68906]: ERROR nova.compute.manager [instance: 46481a4e-ac53-456d-b6cb-9f3ffbccf407] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 777.290565] env[68906]: ERROR nova.compute.manager [instance: 46481a4e-ac53-456d-b6cb-9f3ffbccf407] return self.wait_for_task(task_ref) [ 777.290565] env[68906]: ERROR nova.compute.manager [instance: 46481a4e-ac53-456d-b6cb-9f3ffbccf407] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 777.290565] env[68906]: ERROR nova.compute.manager [instance: 46481a4e-ac53-456d-b6cb-9f3ffbccf407] return evt.wait() [ 777.290565] env[68906]: ERROR nova.compute.manager [instance: 46481a4e-ac53-456d-b6cb-9f3ffbccf407] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 777.290565] env[68906]: ERROR nova.compute.manager [instance: 46481a4e-ac53-456d-b6cb-9f3ffbccf407] result = hub.switch() [ 777.290565] env[68906]: ERROR nova.compute.manager [instance: 46481a4e-ac53-456d-b6cb-9f3ffbccf407] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 777.290565] env[68906]: ERROR nova.compute.manager [instance: 46481a4e-ac53-456d-b6cb-9f3ffbccf407] return self.greenlet.switch() [ 777.291081] env[68906]: ERROR nova.compute.manager [instance: 46481a4e-ac53-456d-b6cb-9f3ffbccf407] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 777.291081] env[68906]: ERROR nova.compute.manager [instance: 46481a4e-ac53-456d-b6cb-9f3ffbccf407] self.f(*self.args, **self.kw) [ 777.291081] env[68906]: ERROR nova.compute.manager [instance: 46481a4e-ac53-456d-b6cb-9f3ffbccf407] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 777.291081] env[68906]: ERROR nova.compute.manager [instance: 46481a4e-ac53-456d-b6cb-9f3ffbccf407] raise exceptions.translate_fault(task_info.error) [ 777.291081] env[68906]: ERROR nova.compute.manager [instance: 46481a4e-ac53-456d-b6cb-9f3ffbccf407] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 777.291081] env[68906]: ERROR nova.compute.manager [instance: 46481a4e-ac53-456d-b6cb-9f3ffbccf407] Faults: ['InvalidArgument'] [ 777.291081] env[68906]: ERROR nova.compute.manager [instance: 46481a4e-ac53-456d-b6cb-9f3ffbccf407] [ 777.291081] env[68906]: INFO nova.compute.manager [None req-6e630c8f-2ec9-418e-9d9b-a4ca682ae7cd tempest-FloatingIPsAssociationTestJSON-1370684228 tempest-FloatingIPsAssociationTestJSON-1370684228-project-member] [instance: 
46481a4e-ac53-456d-b6cb-9f3ffbccf407] Terminating instance [ 777.292138] env[68906]: DEBUG oslo_concurrency.lockutils [None req-77526f0d-ba0d-4648-acf7-33789433c0ca tempest-ImagesNegativeTestJSON-533604680 tempest-ImagesNegativeTestJSON-533604680-project-member] Acquired lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e/b1400c31-d33b-4e13-944f-4c645e62493e.vmdk" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 777.292347] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-77526f0d-ba0d-4648-acf7-33789433c0ca tempest-ImagesNegativeTestJSON-533604680 tempest-ImagesNegativeTestJSON-533604680-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=68906) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 777.292583] env[68906]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-6380c9e0-96e3-4856-872a-78eff084b90e {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 777.294876] env[68906]: DEBUG nova.compute.manager [None req-6e630c8f-2ec9-418e-9d9b-a4ca682ae7cd tempest-FloatingIPsAssociationTestJSON-1370684228 tempest-FloatingIPsAssociationTestJSON-1370684228-project-member] [instance: 46481a4e-ac53-456d-b6cb-9f3ffbccf407] Start destroying the instance on the hypervisor. {{(pid=68906) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 777.295087] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-6e630c8f-2ec9-418e-9d9b-a4ca682ae7cd tempest-FloatingIPsAssociationTestJSON-1370684228 tempest-FloatingIPsAssociationTestJSON-1370684228-project-member] [instance: 46481a4e-ac53-456d-b6cb-9f3ffbccf407] Destroying instance {{(pid=68906) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 777.295846] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-93721a67-c2a1-4ba7-bfc3-51f2a6468c89 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 777.303093] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-6e630c8f-2ec9-418e-9d9b-a4ca682ae7cd tempest-FloatingIPsAssociationTestJSON-1370684228 tempest-FloatingIPsAssociationTestJSON-1370684228-project-member] [instance: 46481a4e-ac53-456d-b6cb-9f3ffbccf407] Unregistering the VM {{(pid=68906) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 777.303352] env[68906]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-71f583f7-8ada-4453-9a13-76a43ec68fab {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 777.305628] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-77526f0d-ba0d-4648-acf7-33789433c0ca tempest-ImagesNegativeTestJSON-533604680 tempest-ImagesNegativeTestJSON-533604680-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=68906) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 777.305805] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-77526f0d-ba0d-4648-acf7-33789433c0ca tempest-ImagesNegativeTestJSON-533604680 tempest-ImagesNegativeTestJSON-533604680-project-member] Folder [datastore2] devstack-image-cache_base created. 
{{(pid=68906) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 777.306738] env[68906]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-4922c912-6b5c-46c5-a9d6-63f91b21f7a4 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 777.311485] env[68906]: DEBUG oslo_vmware.api [None req-77526f0d-ba0d-4648-acf7-33789433c0ca tempest-ImagesNegativeTestJSON-533604680 tempest-ImagesNegativeTestJSON-533604680-project-member] Waiting for the task: (returnval){ [ 777.311485] env[68906]: value = "session[52a3cb0d-1212-490c-2669-91043b4da4d8]52cac51f-f32e-e829-55d2-3d75ba1f9a9e" [ 777.311485] env[68906]: _type = "Task" [ 777.311485] env[68906]: } to complete. {{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 777.319105] env[68906]: DEBUG oslo_vmware.api [None req-77526f0d-ba0d-4648-acf7-33789433c0ca tempest-ImagesNegativeTestJSON-533604680 tempest-ImagesNegativeTestJSON-533604680-project-member] Task: {'id': session[52a3cb0d-1212-490c-2669-91043b4da4d8]52cac51f-f32e-e829-55d2-3d75ba1f9a9e, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 777.587331] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-6e630c8f-2ec9-418e-9d9b-a4ca682ae7cd tempest-FloatingIPsAssociationTestJSON-1370684228 tempest-FloatingIPsAssociationTestJSON-1370684228-project-member] [instance: 46481a4e-ac53-456d-b6cb-9f3ffbccf407] Unregistered the VM {{(pid=68906) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 777.587575] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-6e630c8f-2ec9-418e-9d9b-a4ca682ae7cd tempest-FloatingIPsAssociationTestJSON-1370684228 tempest-FloatingIPsAssociationTestJSON-1370684228-project-member] [instance: 46481a4e-ac53-456d-b6cb-9f3ffbccf407] Deleting contents of the VM from datastore datastore2 {{(pid=68906) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 777.587782] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-6e630c8f-2ec9-418e-9d9b-a4ca682ae7cd tempest-FloatingIPsAssociationTestJSON-1370684228 tempest-FloatingIPsAssociationTestJSON-1370684228-project-member] Deleting the datastore file [datastore2] 46481a4e-ac53-456d-b6cb-9f3ffbccf407 {{(pid=68906) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 777.588015] env[68906]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-82e1b314-6dcd-4b87-bd29-a848ddc94d83 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 777.595540] env[68906]: DEBUG oslo_vmware.api [None req-6e630c8f-2ec9-418e-9d9b-a4ca682ae7cd tempest-FloatingIPsAssociationTestJSON-1370684228 tempest-FloatingIPsAssociationTestJSON-1370684228-project-member] Waiting for the task: (returnval){ [ 777.595540] env[68906]: value = "task-3475305" [ 777.595540] env[68906]: _type = "Task" [ 777.595540] env[68906]: } to complete. {{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 777.604188] env[68906]: DEBUG oslo_vmware.api [None req-6e630c8f-2ec9-418e-9d9b-a4ca682ae7cd tempest-FloatingIPsAssociationTestJSON-1370684228 tempest-FloatingIPsAssociationTestJSON-1370684228-project-member] Task: {'id': task-3475305, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 777.822076] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-77526f0d-ba0d-4648-acf7-33789433c0ca tempest-ImagesNegativeTestJSON-533604680 tempest-ImagesNegativeTestJSON-533604680-project-member] [instance: d2258ded-478a-4530-b940-386286702048] Preparing fetch location {{(pid=68906) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 777.822386] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-77526f0d-ba0d-4648-acf7-33789433c0ca tempest-ImagesNegativeTestJSON-533604680 tempest-ImagesNegativeTestJSON-533604680-project-member] Creating directory with path [datastore2] vmware_temp/d348da25-6c16-4330-9645-fa4c032c0ae8/b1400c31-d33b-4e13-944f-4c645e62493e {{(pid=68906) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 777.822581] env[68906]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-4a328e88-b4f1-4549-b9d7-a777dbf8b9b6 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 777.841674] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-77526f0d-ba0d-4648-acf7-33789433c0ca tempest-ImagesNegativeTestJSON-533604680 tempest-ImagesNegativeTestJSON-533604680-project-member] Created directory with path [datastore2] vmware_temp/d348da25-6c16-4330-9645-fa4c032c0ae8/b1400c31-d33b-4e13-944f-4c645e62493e {{(pid=68906) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 777.841862] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-77526f0d-ba0d-4648-acf7-33789433c0ca tempest-ImagesNegativeTestJSON-533604680 tempest-ImagesNegativeTestJSON-533604680-project-member] [instance: d2258ded-478a-4530-b940-386286702048] Fetch image to [datastore2] vmware_temp/d348da25-6c16-4330-9645-fa4c032c0ae8/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk {{(pid=68906) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 777.842047] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-77526f0d-ba0d-4648-acf7-33789433c0ca tempest-ImagesNegativeTestJSON-533604680 tempest-ImagesNegativeTestJSON-533604680-project-member] [instance: d2258ded-478a-4530-b940-386286702048] Downloading image file data b1400c31-d33b-4e13-944f-4c645e62493e to [datastore2] vmware_temp/d348da25-6c16-4330-9645-fa4c032c0ae8/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk on the data store datastore2 {{(pid=68906) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 777.842776] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-17da4b1d-7019-421b-9897-4842f743df80 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 777.849250] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-83d5f01e-6cc2-4ce1-b144-467602959554 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 777.858992] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c6a68e62-5114-45fe-97c5-71ae5aec4625 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 777.889544] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e6dfeeeb-2275-46f6-9f67-6c91acd1fca9 {{(pid=68906) 
request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 777.895591] env[68906]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-3ad7eff6-d7be-41f5-952c-3a27162a211b {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 777.925775] env[68906]: DEBUG nova.virt.vmwareapi.images [None req-77526f0d-ba0d-4648-acf7-33789433c0ca tempest-ImagesNegativeTestJSON-533604680 tempest-ImagesNegativeTestJSON-533604680-project-member] [instance: d2258ded-478a-4530-b940-386286702048] Downloading image file data b1400c31-d33b-4e13-944f-4c645e62493e to the data store datastore2 {{(pid=68906) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 777.980415] env[68906]: DEBUG oslo_vmware.rw_handles [None req-77526f0d-ba0d-4648-acf7-33789433c0ca tempest-ImagesNegativeTestJSON-533604680 tempest-ImagesNegativeTestJSON-533604680-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/d348da25-6c16-4330-9645-fa4c032c0ae8/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=68906) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 778.040444] env[68906]: DEBUG oslo_vmware.rw_handles [None req-77526f0d-ba0d-4648-acf7-33789433c0ca tempest-ImagesNegativeTestJSON-533604680 tempest-ImagesNegativeTestJSON-533604680-project-member] Completed reading data from the image iterator. {{(pid=68906) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 778.040612] env[68906]: DEBUG oslo_vmware.rw_handles [None req-77526f0d-ba0d-4648-acf7-33789433c0ca tempest-ImagesNegativeTestJSON-533604680 tempest-ImagesNegativeTestJSON-533604680-project-member] Closing write handle for https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/d348da25-6c16-4330-9645-fa4c032c0ae8/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=68906) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 778.105098] env[68906]: DEBUG oslo_vmware.api [None req-6e630c8f-2ec9-418e-9d9b-a4ca682ae7cd tempest-FloatingIPsAssociationTestJSON-1370684228 tempest-FloatingIPsAssociationTestJSON-1370684228-project-member] Task: {'id': task-3475305, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.165523} completed successfully. 
{{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 778.105352] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-6e630c8f-2ec9-418e-9d9b-a4ca682ae7cd tempest-FloatingIPsAssociationTestJSON-1370684228 tempest-FloatingIPsAssociationTestJSON-1370684228-project-member] Deleted the datastore file {{(pid=68906) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 778.105534] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-6e630c8f-2ec9-418e-9d9b-a4ca682ae7cd tempest-FloatingIPsAssociationTestJSON-1370684228 tempest-FloatingIPsAssociationTestJSON-1370684228-project-member] [instance: 46481a4e-ac53-456d-b6cb-9f3ffbccf407] Deleted contents of the VM from datastore datastore2 {{(pid=68906) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 778.105703] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-6e630c8f-2ec9-418e-9d9b-a4ca682ae7cd tempest-FloatingIPsAssociationTestJSON-1370684228 tempest-FloatingIPsAssociationTestJSON-1370684228-project-member] [instance: 46481a4e-ac53-456d-b6cb-9f3ffbccf407] Instance destroyed {{(pid=68906) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 778.105872] env[68906]: INFO nova.compute.manager [None req-6e630c8f-2ec9-418e-9d9b-a4ca682ae7cd tempest-FloatingIPsAssociationTestJSON-1370684228 tempest-FloatingIPsAssociationTestJSON-1370684228-project-member] [instance: 46481a4e-ac53-456d-b6cb-9f3ffbccf407] Took 0.81 seconds to destroy the instance on the hypervisor. [ 778.107999] env[68906]: DEBUG nova.compute.claims [None req-6e630c8f-2ec9-418e-9d9b-a4ca682ae7cd tempest-FloatingIPsAssociationTestJSON-1370684228 tempest-FloatingIPsAssociationTestJSON-1370684228-project-member] [instance: 46481a4e-ac53-456d-b6cb-9f3ffbccf407] Aborting claim: {{(pid=68906) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 778.108195] env[68906]: DEBUG oslo_concurrency.lockutils [None req-6e630c8f-2ec9-418e-9d9b-a4ca682ae7cd tempest-FloatingIPsAssociationTestJSON-1370684228 tempest-FloatingIPsAssociationTestJSON-1370684228-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 778.108405] env[68906]: DEBUG oslo_concurrency.lockutils [None req-6e630c8f-2ec9-418e-9d9b-a4ca682ae7cd tempest-FloatingIPsAssociationTestJSON-1370684228 tempest-FloatingIPsAssociationTestJSON-1370684228-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 778.539572] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c99403b5-6a4e-4de1-b0f0-972aaedddb33 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 778.547333] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1a685ddc-b457-43d3-85cb-13c13b6f2fc6 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 778.577740] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-883e1751-5a92-484f-8dac-d666ee87a842 {{(pid=68906) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 778.585471] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3d4bebf7-2099-4a9a-bf97-c4c720d7de6c {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 778.599623] env[68906]: DEBUG nova.compute.provider_tree [None req-6e630c8f-2ec9-418e-9d9b-a4ca682ae7cd tempest-FloatingIPsAssociationTestJSON-1370684228 tempest-FloatingIPsAssociationTestJSON-1370684228-project-member] Inventory has not changed in ProviderTree for provider: 1119f6db-bfd7-4ef3-bdff-5c6974dc249b {{(pid=68906) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 778.608255] env[68906]: DEBUG nova.scheduler.client.report [None req-6e630c8f-2ec9-418e-9d9b-a4ca682ae7cd tempest-FloatingIPsAssociationTestJSON-1370684228 tempest-FloatingIPsAssociationTestJSON-1370684228-project-member] Inventory has not changed for provider 1119f6db-bfd7-4ef3-bdff-5c6974dc249b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 93, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68906) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 778.625756] env[68906]: DEBUG oslo_concurrency.lockutils [None req-6e630c8f-2ec9-418e-9d9b-a4ca682ae7cd tempest-FloatingIPsAssociationTestJSON-1370684228 tempest-FloatingIPsAssociationTestJSON-1370684228-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.517s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 778.627065] env[68906]: ERROR nova.compute.manager [None req-6e630c8f-2ec9-418e-9d9b-a4ca682ae7cd tempest-FloatingIPsAssociationTestJSON-1370684228 tempest-FloatingIPsAssociationTestJSON-1370684228-project-member] [instance: 46481a4e-ac53-456d-b6cb-9f3ffbccf407] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 778.627065] env[68906]: Faults: ['InvalidArgument'] [ 778.627065] env[68906]: ERROR nova.compute.manager [instance: 46481a4e-ac53-456d-b6cb-9f3ffbccf407] Traceback (most recent call last): [ 778.627065] env[68906]: ERROR nova.compute.manager [instance: 46481a4e-ac53-456d-b6cb-9f3ffbccf407] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 778.627065] env[68906]: ERROR nova.compute.manager [instance: 46481a4e-ac53-456d-b6cb-9f3ffbccf407] self.driver.spawn(context, instance, image_meta, [ 778.627065] env[68906]: ERROR nova.compute.manager [instance: 46481a4e-ac53-456d-b6cb-9f3ffbccf407] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 778.627065] env[68906]: ERROR nova.compute.manager [instance: 46481a4e-ac53-456d-b6cb-9f3ffbccf407] self._vmops.spawn(context, instance, image_meta, injected_files, [ 778.627065] env[68906]: ERROR nova.compute.manager [instance: 46481a4e-ac53-456d-b6cb-9f3ffbccf407] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 778.627065] env[68906]: ERROR nova.compute.manager [instance: 46481a4e-ac53-456d-b6cb-9f3ffbccf407] self._fetch_image_if_missing(context, vi) [ 
778.627065] env[68906]: ERROR nova.compute.manager [instance: 46481a4e-ac53-456d-b6cb-9f3ffbccf407] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 778.627065] env[68906]: ERROR nova.compute.manager [instance: 46481a4e-ac53-456d-b6cb-9f3ffbccf407] image_cache(vi, tmp_image_ds_loc) [ 778.627065] env[68906]: ERROR nova.compute.manager [instance: 46481a4e-ac53-456d-b6cb-9f3ffbccf407] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 778.627406] env[68906]: ERROR nova.compute.manager [instance: 46481a4e-ac53-456d-b6cb-9f3ffbccf407] vm_util.copy_virtual_disk( [ 778.627406] env[68906]: ERROR nova.compute.manager [instance: 46481a4e-ac53-456d-b6cb-9f3ffbccf407] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 778.627406] env[68906]: ERROR nova.compute.manager [instance: 46481a4e-ac53-456d-b6cb-9f3ffbccf407] session._wait_for_task(vmdk_copy_task) [ 778.627406] env[68906]: ERROR nova.compute.manager [instance: 46481a4e-ac53-456d-b6cb-9f3ffbccf407] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 778.627406] env[68906]: ERROR nova.compute.manager [instance: 46481a4e-ac53-456d-b6cb-9f3ffbccf407] return self.wait_for_task(task_ref) [ 778.627406] env[68906]: ERROR nova.compute.manager [instance: 46481a4e-ac53-456d-b6cb-9f3ffbccf407] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 778.627406] env[68906]: ERROR nova.compute.manager [instance: 46481a4e-ac53-456d-b6cb-9f3ffbccf407] return evt.wait() [ 778.627406] env[68906]: ERROR nova.compute.manager [instance: 46481a4e-ac53-456d-b6cb-9f3ffbccf407] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 778.627406] env[68906]: ERROR nova.compute.manager [instance: 46481a4e-ac53-456d-b6cb-9f3ffbccf407] result = hub.switch() [ 778.627406] env[68906]: ERROR nova.compute.manager [instance: 46481a4e-ac53-456d-b6cb-9f3ffbccf407] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 778.627406] env[68906]: ERROR nova.compute.manager [instance: 46481a4e-ac53-456d-b6cb-9f3ffbccf407] return self.greenlet.switch() [ 778.627406] env[68906]: ERROR nova.compute.manager [instance: 46481a4e-ac53-456d-b6cb-9f3ffbccf407] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 778.627406] env[68906]: ERROR nova.compute.manager [instance: 46481a4e-ac53-456d-b6cb-9f3ffbccf407] self.f(*self.args, **self.kw) [ 778.627699] env[68906]: ERROR nova.compute.manager [instance: 46481a4e-ac53-456d-b6cb-9f3ffbccf407] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 778.627699] env[68906]: ERROR nova.compute.manager [instance: 46481a4e-ac53-456d-b6cb-9f3ffbccf407] raise exceptions.translate_fault(task_info.error) [ 778.627699] env[68906]: ERROR nova.compute.manager [instance: 46481a4e-ac53-456d-b6cb-9f3ffbccf407] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 778.627699] env[68906]: ERROR nova.compute.manager [instance: 46481a4e-ac53-456d-b6cb-9f3ffbccf407] Faults: ['InvalidArgument'] [ 778.627699] env[68906]: ERROR nova.compute.manager [instance: 46481a4e-ac53-456d-b6cb-9f3ffbccf407] [ 778.627699] env[68906]: DEBUG nova.compute.utils [None req-6e630c8f-2ec9-418e-9d9b-a4ca682ae7cd tempest-FloatingIPsAssociationTestJSON-1370684228 
tempest-FloatingIPsAssociationTestJSON-1370684228-project-member] [instance: 46481a4e-ac53-456d-b6cb-9f3ffbccf407] VimFaultException {{(pid=68906) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 778.629028] env[68906]: DEBUG nova.compute.manager [None req-6e630c8f-2ec9-418e-9d9b-a4ca682ae7cd tempest-FloatingIPsAssociationTestJSON-1370684228 tempest-FloatingIPsAssociationTestJSON-1370684228-project-member] [instance: 46481a4e-ac53-456d-b6cb-9f3ffbccf407] Build of instance 46481a4e-ac53-456d-b6cb-9f3ffbccf407 was re-scheduled: A specified parameter was not correct: fileType [ 778.629028] env[68906]: Faults: ['InvalidArgument'] {{(pid=68906) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 778.629211] env[68906]: DEBUG nova.compute.manager [None req-6e630c8f-2ec9-418e-9d9b-a4ca682ae7cd tempest-FloatingIPsAssociationTestJSON-1370684228 tempest-FloatingIPsAssociationTestJSON-1370684228-project-member] [instance: 46481a4e-ac53-456d-b6cb-9f3ffbccf407] Unplugging VIFs for instance {{(pid=68906) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 778.629387] env[68906]: DEBUG nova.compute.manager [None req-6e630c8f-2ec9-418e-9d9b-a4ca682ae7cd tempest-FloatingIPsAssociationTestJSON-1370684228 tempest-FloatingIPsAssociationTestJSON-1370684228-project-member] Virt driver does not provide unplug_vifs method, so it is not possible to determine if VIFs should be unplugged. {{(pid=68906) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 778.629590] env[68906]: DEBUG nova.compute.manager [None req-6e630c8f-2ec9-418e-9d9b-a4ca682ae7cd tempest-FloatingIPsAssociationTestJSON-1370684228 tempest-FloatingIPsAssociationTestJSON-1370684228-project-member] [instance: 46481a4e-ac53-456d-b6cb-9f3ffbccf407] Deallocating network for instance {{(pid=68906) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 778.629717] env[68906]: DEBUG nova.network.neutron [None req-6e630c8f-2ec9-418e-9d9b-a4ca682ae7cd tempest-FloatingIPsAssociationTestJSON-1370684228 tempest-FloatingIPsAssociationTestJSON-1370684228-project-member] [instance: 46481a4e-ac53-456d-b6cb-9f3ffbccf407] deallocate_for_instance() {{(pid=68906) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 779.133541] env[68906]: DEBUG nova.network.neutron [None req-6e630c8f-2ec9-418e-9d9b-a4ca682ae7cd tempest-FloatingIPsAssociationTestJSON-1370684228 tempest-FloatingIPsAssociationTestJSON-1370684228-project-member] [instance: 46481a4e-ac53-456d-b6cb-9f3ffbccf407] Updating instance_info_cache with network_info: [] {{(pid=68906) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 779.148717] env[68906]: INFO nova.compute.manager [None req-6e630c8f-2ec9-418e-9d9b-a4ca682ae7cd tempest-FloatingIPsAssociationTestJSON-1370684228 tempest-FloatingIPsAssociationTestJSON-1370684228-project-member] [instance: 46481a4e-ac53-456d-b6cb-9f3ffbccf407] Took 0.52 seconds to deallocate network for instance.
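
The resource-tracker and placement records above report the same inventory on every cycle: 48 VCPU at a 4.0 allocation ratio, 196590 MB of RAM with 512 MB reserved, and 400 GB of disk. As a rough aside on what those figures mean for scheduling, the sketch below applies the standard Placement capacity formula, (total - reserved) * allocation_ratio, to the inventory exactly as logged; the helper function is illustrative, not Nova code.

# Illustrative sketch only: schedulable capacity implied by the inventory
# records logged above. The input values and the formula
# (total - reserved) * allocation_ratio come from the log; the helper
# itself is hypothetical.
INVENTORY = {
    'VCPU':      {'total': 48,     'reserved': 0,   'allocation_ratio': 4.0},
    'MEMORY_MB': {'total': 196590, 'reserved': 512, 'allocation_ratio': 1.0},
    'DISK_GB':   {'total': 400,    'reserved': 0,   'allocation_ratio': 1.0},
}

def effective_capacity(inventory):
    """Capacity the scheduler may allocate against, per resource class."""
    return {rc: (v['total'] - v['reserved']) * v['allocation_ratio']
            for rc, v in inventory.items()}

print(effective_capacity(INVENTORY))
# -> {'VCPU': 192.0, 'MEMORY_MB': 196078.0, 'DISK_GB': 400.0}

Against that effective capacity, the "Final resource view" record earlier (used_vcpus=10, used_ram=1792MB, used_disk=10GB) shows the node far from saturation, consistent with the InvalidArgument failures above being an image-caching fault rather than a capacity problem.
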
[ 779.259509] env[68906]: INFO nova.scheduler.client.report [None req-6e630c8f-2ec9-418e-9d9b-a4ca682ae7cd tempest-FloatingIPsAssociationTestJSON-1370684228 tempest-FloatingIPsAssociationTestJSON-1370684228-project-member] Deleted allocations for instance 46481a4e-ac53-456d-b6cb-9f3ffbccf407 [ 779.289889] env[68906]: DEBUG oslo_concurrency.lockutils [None req-6e630c8f-2ec9-418e-9d9b-a4ca682ae7cd tempest-FloatingIPsAssociationTestJSON-1370684228 tempest-FloatingIPsAssociationTestJSON-1370684228-project-member] Lock "46481a4e-ac53-456d-b6cb-9f3ffbccf407" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 147.773s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 779.320618] env[68906]: DEBUG nova.compute.manager [None req-6d6efef2-b262-4393-93df-3cbea2feabcc tempest-TenantUsagesTestJSON-2001711929 tempest-TenantUsagesTestJSON-2001711929-project-member] [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] Starting instance... {{(pid=68906) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 779.380529] env[68906]: DEBUG oslo_concurrency.lockutils [None req-6d6efef2-b262-4393-93df-3cbea2feabcc tempest-TenantUsagesTestJSON-2001711929 tempest-TenantUsagesTestJSON-2001711929-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 779.380529] env[68906]: DEBUG oslo_concurrency.lockutils [None req-6d6efef2-b262-4393-93df-3cbea2feabcc tempest-TenantUsagesTestJSON-2001711929 tempest-TenantUsagesTestJSON-2001711929-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 779.382046] env[68906]: INFO nova.compute.claims [None req-6d6efef2-b262-4393-93df-3cbea2feabcc tempest-TenantUsagesTestJSON-2001711929 tempest-TenantUsagesTestJSON-2001711929-project-member] [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 779.832077] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9ff2d436-cf7c-4bdd-94bc-3b9d9c0ca04a {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 779.840085] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9499139b-c2ab-4d9a-ac0b-00302cee9463 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 779.871178] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6733e523-069c-42e1-ae7e-eefacd2a17ed {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 779.878908] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-37bf79c2-9f41-4d1b-8c6c-4962232a3b18 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 779.892249] env[68906]: DEBUG nova.compute.provider_tree [None req-6d6efef2-b262-4393-93df-3cbea2feabcc tempest-TenantUsagesTestJSON-2001711929 
tempest-TenantUsagesTestJSON-2001711929-project-member] Inventory has not changed in ProviderTree for provider: 1119f6db-bfd7-4ef3-bdff-5c6974dc249b {{(pid=68906) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 779.900738] env[68906]: DEBUG nova.scheduler.client.report [None req-6d6efef2-b262-4393-93df-3cbea2feabcc tempest-TenantUsagesTestJSON-2001711929 tempest-TenantUsagesTestJSON-2001711929-project-member] Inventory has not changed for provider 1119f6db-bfd7-4ef3-bdff-5c6974dc249b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 93, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68906) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 779.916872] env[68906]: DEBUG oslo_concurrency.lockutils [None req-6d6efef2-b262-4393-93df-3cbea2feabcc tempest-TenantUsagesTestJSON-2001711929 tempest-TenantUsagesTestJSON-2001711929-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.537s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 779.919016] env[68906]: DEBUG nova.compute.manager [None req-6d6efef2-b262-4393-93df-3cbea2feabcc tempest-TenantUsagesTestJSON-2001711929 tempest-TenantUsagesTestJSON-2001711929-project-member] [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] Start building networks asynchronously for instance. {{(pid=68906) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 779.950066] env[68906]: DEBUG nova.compute.utils [None req-6d6efef2-b262-4393-93df-3cbea2feabcc tempest-TenantUsagesTestJSON-2001711929 tempest-TenantUsagesTestJSON-2001711929-project-member] Using /dev/sd instead of None {{(pid=68906) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 779.952156] env[68906]: DEBUG nova.compute.manager [None req-6d6efef2-b262-4393-93df-3cbea2feabcc tempest-TenantUsagesTestJSON-2001711929 tempest-TenantUsagesTestJSON-2001711929-project-member] [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] Allocating IP information in the background. {{(pid=68906) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 779.952497] env[68906]: DEBUG nova.network.neutron [None req-6d6efef2-b262-4393-93df-3cbea2feabcc tempest-TenantUsagesTestJSON-2001711929 tempest-TenantUsagesTestJSON-2001711929-project-member] [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] allocate_for_instance() {{(pid=68906) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 779.966245] env[68906]: DEBUG nova.compute.manager [None req-6d6efef2-b262-4393-93df-3cbea2feabcc tempest-TenantUsagesTestJSON-2001711929 tempest-TenantUsagesTestJSON-2001711929-project-member] [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] Start building block device mappings for instance. {{(pid=68906) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 780.033298] env[68906]: DEBUG nova.compute.manager [None req-6d6efef2-b262-4393-93df-3cbea2feabcc tempest-TenantUsagesTestJSON-2001711929 tempest-TenantUsagesTestJSON-2001711929-project-member] [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] Start spawning the instance on the hypervisor. 
{{(pid=68906) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 780.061213] env[68906]: DEBUG nova.virt.hardware [None req-6d6efef2-b262-4393-93df-3cbea2feabcc tempest-TenantUsagesTestJSON-2001711929 tempest-TenantUsagesTestJSON-2001711929-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-17T13:00:39Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-17T13:00:23Z,direct_url=,disk_format='vmdk',id=b1400c31-d33b-4e13-944f-4c645e62493e,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='1ae7bf3a375d41c6af5e7536af51ffd1',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-17T13:00:24Z,virtual_size=,visibility=), allow threads: False {{(pid=68906) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 780.061717] env[68906]: DEBUG nova.virt.hardware [None req-6d6efef2-b262-4393-93df-3cbea2feabcc tempest-TenantUsagesTestJSON-2001711929 tempest-TenantUsagesTestJSON-2001711929-project-member] Flavor limits 0:0:0 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 780.062030] env[68906]: DEBUG nova.virt.hardware [None req-6d6efef2-b262-4393-93df-3cbea2feabcc tempest-TenantUsagesTestJSON-2001711929 tempest-TenantUsagesTestJSON-2001711929-project-member] Image limits 0:0:0 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 780.063246] env[68906]: DEBUG nova.virt.hardware [None req-6d6efef2-b262-4393-93df-3cbea2feabcc tempest-TenantUsagesTestJSON-2001711929 tempest-TenantUsagesTestJSON-2001711929-project-member] Flavor pref 0:0:0 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 780.063246] env[68906]: DEBUG nova.virt.hardware [None req-6d6efef2-b262-4393-93df-3cbea2feabcc tempest-TenantUsagesTestJSON-2001711929 tempest-TenantUsagesTestJSON-2001711929-project-member] Image pref 0:0:0 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 780.063246] env[68906]: DEBUG nova.virt.hardware [None req-6d6efef2-b262-4393-93df-3cbea2feabcc tempest-TenantUsagesTestJSON-2001711929 tempest-TenantUsagesTestJSON-2001711929-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 780.063246] env[68906]: DEBUG nova.virt.hardware [None req-6d6efef2-b262-4393-93df-3cbea2feabcc tempest-TenantUsagesTestJSON-2001711929 tempest-TenantUsagesTestJSON-2001711929-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68906) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 780.063246] env[68906]: DEBUG nova.virt.hardware [None req-6d6efef2-b262-4393-93df-3cbea2feabcc tempest-TenantUsagesTestJSON-2001711929 tempest-TenantUsagesTestJSON-2001711929-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68906) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 780.063419] env[68906]: DEBUG nova.virt.hardware [None 
req-6d6efef2-b262-4393-93df-3cbea2feabcc tempest-TenantUsagesTestJSON-2001711929 tempest-TenantUsagesTestJSON-2001711929-project-member] Got 1 possible topologies {{(pid=68906) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 780.064058] env[68906]: DEBUG nova.virt.hardware [None req-6d6efef2-b262-4393-93df-3cbea2feabcc tempest-TenantUsagesTestJSON-2001711929 tempest-TenantUsagesTestJSON-2001711929-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68906) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 780.064058] env[68906]: DEBUG nova.virt.hardware [None req-6d6efef2-b262-4393-93df-3cbea2feabcc tempest-TenantUsagesTestJSON-2001711929 tempest-TenantUsagesTestJSON-2001711929-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68906) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 780.065128] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c0d89a6b-f256-44dc-8d82-402c40d07428 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 780.069678] env[68906]: DEBUG nova.policy [None req-6d6efef2-b262-4393-93df-3cbea2feabcc tempest-TenantUsagesTestJSON-2001711929 tempest-TenantUsagesTestJSON-2001711929-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'e2ba5689487a4ed2b7cbe3ef6b418484', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'd00ddd09cd1846b5b8be8980c125a562', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68906) authorize /opt/stack/nova/nova/policy.py:203}} [ 780.077021] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3149946f-eb84-42ae-8661-cf6be555a38a {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 780.902498] env[68906]: DEBUG nova.network.neutron [None req-6d6efef2-b262-4393-93df-3cbea2feabcc tempest-TenantUsagesTestJSON-2001711929 tempest-TenantUsagesTestJSON-2001711929-project-member] [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] Successfully created port: e381b776-4d9b-43e2-8259-362887d031b1 {{(pid=68906) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 782.007096] env[68906]: DEBUG nova.network.neutron [None req-6d6efef2-b262-4393-93df-3cbea2feabcc tempest-TenantUsagesTestJSON-2001711929 tempest-TenantUsagesTestJSON-2001711929-project-member] [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] Successfully updated port: e381b776-4d9b-43e2-8259-362887d031b1 {{(pid=68906) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 782.021364] env[68906]: DEBUG oslo_concurrency.lockutils [None req-6d6efef2-b262-4393-93df-3cbea2feabcc tempest-TenantUsagesTestJSON-2001711929 tempest-TenantUsagesTestJSON-2001711929-project-member] Acquiring lock "refresh_cache-a7e0a28f-42a5-442e-b962-07771d2e6a27" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 782.021718] env[68906]: DEBUG oslo_concurrency.lockutils [None req-6d6efef2-b262-4393-93df-3cbea2feabcc tempest-TenantUsagesTestJSON-2001711929 
tempest-TenantUsagesTestJSON-2001711929-project-member] Acquired lock "refresh_cache-a7e0a28f-42a5-442e-b962-07771d2e6a27" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 782.021718] env[68906]: DEBUG nova.network.neutron [None req-6d6efef2-b262-4393-93df-3cbea2feabcc tempest-TenantUsagesTestJSON-2001711929 tempest-TenantUsagesTestJSON-2001711929-project-member] [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] Building network info cache for instance {{(pid=68906) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 782.122606] env[68906]: DEBUG nova.network.neutron [None req-6d6efef2-b262-4393-93df-3cbea2feabcc tempest-TenantUsagesTestJSON-2001711929 tempest-TenantUsagesTestJSON-2001711929-project-member] [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] Instance cache missing network info. {{(pid=68906) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 782.247816] env[68906]: DEBUG nova.compute.manager [req-08e80bee-cd66-4fb5-9e5c-ccfb6019989c req-744ed7ed-788f-4bd5-8070-70dab438683d service nova] [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] Received event network-vif-plugged-e381b776-4d9b-43e2-8259-362887d031b1 {{(pid=68906) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 782.248125] env[68906]: DEBUG oslo_concurrency.lockutils [req-08e80bee-cd66-4fb5-9e5c-ccfb6019989c req-744ed7ed-788f-4bd5-8070-70dab438683d service nova] Acquiring lock "a7e0a28f-42a5-442e-b962-07771d2e6a27-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 782.248276] env[68906]: DEBUG oslo_concurrency.lockutils [req-08e80bee-cd66-4fb5-9e5c-ccfb6019989c req-744ed7ed-788f-4bd5-8070-70dab438683d service nova] Lock "a7e0a28f-42a5-442e-b962-07771d2e6a27-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 782.248437] env[68906]: DEBUG oslo_concurrency.lockutils [req-08e80bee-cd66-4fb5-9e5c-ccfb6019989c req-744ed7ed-788f-4bd5-8070-70dab438683d service nova] Lock "a7e0a28f-42a5-442e-b962-07771d2e6a27-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 782.248616] env[68906]: DEBUG nova.compute.manager [req-08e80bee-cd66-4fb5-9e5c-ccfb6019989c req-744ed7ed-788f-4bd5-8070-70dab438683d service nova] [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] No waiting events found dispatching network-vif-plugged-e381b776-4d9b-43e2-8259-362887d031b1 {{(pid=68906) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 782.248762] env[68906]: WARNING nova.compute.manager [req-08e80bee-cd66-4fb5-9e5c-ccfb6019989c req-744ed7ed-788f-4bd5-8070-70dab438683d service nova] [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] Received unexpected event network-vif-plugged-e381b776-4d9b-43e2-8259-362887d031b1 for instance with vm_state building and task_state spawning. 
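The Acquiring/acquired/released triplets recorded above are emitted by oslo.concurrency's lockutils helpers. A minimal sketch of the same pattern in Python, with illustrative lock names (the real names above are derived from instance UUIDs and cache keys, not hard-coded like this):

    from oslo_concurrency import lockutils

    # Decorator form: the wrapped call runs under the named lock, and
    # lockutils logs "Acquiring lock ...", 'Lock ... acquired ... waited Ns'
    # and 'Lock ... "released" ... held Ns' around it, as in the lines above.
    @lockutils.synchronized('refresh_cache-<instance-uuid>')  # illustrative name
    def refresh_instance_cache():
        pass  # critical section: rebuild the instance network info cache

    # Context-manager form, used for locks such as "compute_resources":
    with lockutils.lock('compute_resources'):
        pass  # critical section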
[ 782.443251] env[68906]: DEBUG nova.network.neutron [None req-6d6efef2-b262-4393-93df-3cbea2feabcc tempest-TenantUsagesTestJSON-2001711929 tempest-TenantUsagesTestJSON-2001711929-project-member] [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] Updating instance_info_cache with network_info: [{"id": "e381b776-4d9b-43e2-8259-362887d031b1", "address": "fa:16:3e:c4:5e:ba", "network": {"id": "63efabfb-0028-4758-9626-5f9860440121", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.98", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "1ae7bf3a375d41c6af5e7536af51ffd1", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "69054a13-b7ef-44e1-bd3b-3ca5ba602848", "external-id": "nsx-vlan-transportzone-153", "segmentation_id": 153, "bound_drivers": {"0": "nsxv3"}}, "devname": "tape381b776-4d", "ovs_interfaceid": "e381b776-4d9b-43e2-8259-362887d031b1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68906) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 782.461336] env[68906]: DEBUG oslo_concurrency.lockutils [None req-6d6efef2-b262-4393-93df-3cbea2feabcc tempest-TenantUsagesTestJSON-2001711929 tempest-TenantUsagesTestJSON-2001711929-project-member] Releasing lock "refresh_cache-a7e0a28f-42a5-442e-b962-07771d2e6a27" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 782.461690] env[68906]: DEBUG nova.compute.manager [None req-6d6efef2-b262-4393-93df-3cbea2feabcc tempest-TenantUsagesTestJSON-2001711929 tempest-TenantUsagesTestJSON-2001711929-project-member] [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] Instance network_info: |[{"id": "e381b776-4d9b-43e2-8259-362887d031b1", "address": "fa:16:3e:c4:5e:ba", "network": {"id": "63efabfb-0028-4758-9626-5f9860440121", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.98", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "1ae7bf3a375d41c6af5e7536af51ffd1", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "69054a13-b7ef-44e1-bd3b-3ca5ba602848", "external-id": "nsx-vlan-transportzone-153", "segmentation_id": 153, "bound_drivers": {"0": "nsxv3"}}, "devname": "tape381b776-4d", "ovs_interfaceid": "e381b776-4d9b-43e2-8259-362887d031b1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=68906) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 782.462364] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-6d6efef2-b262-4393-93df-3cbea2feabcc tempest-TenantUsagesTestJSON-2001711929 
tempest-TenantUsagesTestJSON-2001711929-project-member] [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:c4:5e:ba', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '69054a13-b7ef-44e1-bd3b-3ca5ba602848', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'e381b776-4d9b-43e2-8259-362887d031b1', 'vif_model': 'vmxnet3'}] {{(pid=68906) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 782.478079] env[68906]: DEBUG nova.virt.vmwareapi.vm_util [None req-6d6efef2-b262-4393-93df-3cbea2feabcc tempest-TenantUsagesTestJSON-2001711929 tempest-TenantUsagesTestJSON-2001711929-project-member] Creating folder: Project (d00ddd09cd1846b5b8be8980c125a562). Parent ref: group-v694750. {{(pid=68906) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 782.478709] env[68906]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-4c8dfe4e-ee94-425b-9710-acbbb3438566 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 782.489966] env[68906]: INFO nova.virt.vmwareapi.vm_util [None req-6d6efef2-b262-4393-93df-3cbea2feabcc tempest-TenantUsagesTestJSON-2001711929 tempest-TenantUsagesTestJSON-2001711929-project-member] Created folder: Project (d00ddd09cd1846b5b8be8980c125a562) in parent group-v694750. [ 782.490181] env[68906]: DEBUG nova.virt.vmwareapi.vm_util [None req-6d6efef2-b262-4393-93df-3cbea2feabcc tempest-TenantUsagesTestJSON-2001711929 tempest-TenantUsagesTestJSON-2001711929-project-member] Creating folder: Instances. Parent ref: group-v694789. {{(pid=68906) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 782.490406] env[68906]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-03e52480-ee48-4ed9-a2ad-a63d41607c69 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 782.499091] env[68906]: INFO nova.virt.vmwareapi.vm_util [None req-6d6efef2-b262-4393-93df-3cbea2feabcc tempest-TenantUsagesTestJSON-2001711929 tempest-TenantUsagesTestJSON-2001711929-project-member] Created folder: Instances in parent group-v694789. [ 782.499530] env[68906]: DEBUG oslo.service.loopingcall [None req-6d6efef2-b262-4393-93df-3cbea2feabcc tempest-TenantUsagesTestJSON-2001711929 tempest-TenantUsagesTestJSON-2001711929-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=68906) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 782.499530] env[68906]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] Creating VM on the ESX host {{(pid=68906) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 782.499701] env[68906]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-d0e02ede-432e-4895-95de-23c7dd03ffdf {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 782.521888] env[68906]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 782.521888] env[68906]: value = "task-3475308" [ 782.521888] env[68906]: _type = "Task" [ 782.521888] env[68906]: } to complete. {{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 782.529216] env[68906]: DEBUG oslo_vmware.api [-] Task: {'id': task-3475308, 'name': CreateVM_Task} progress is 0%. 
{{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 783.032583] env[68906]: DEBUG oslo_vmware.api [-] Task: {'id': task-3475308, 'name': CreateVM_Task, 'duration_secs': 0.298138} completed successfully. {{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 783.033029] env[68906]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] Created VM on the ESX host {{(pid=68906) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 783.033500] env[68906]: DEBUG oslo_concurrency.lockutils [None req-6d6efef2-b262-4393-93df-3cbea2feabcc tempest-TenantUsagesTestJSON-2001711929 tempest-TenantUsagesTestJSON-2001711929-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 783.033665] env[68906]: DEBUG oslo_concurrency.lockutils [None req-6d6efef2-b262-4393-93df-3cbea2feabcc tempest-TenantUsagesTestJSON-2001711929 tempest-TenantUsagesTestJSON-2001711929-project-member] Acquired lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 783.033983] env[68906]: DEBUG oslo_concurrency.lockutils [None req-6d6efef2-b262-4393-93df-3cbea2feabcc tempest-TenantUsagesTestJSON-2001711929 tempest-TenantUsagesTestJSON-2001711929-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 783.034286] env[68906]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-527c831b-011a-46a7-bc41-51750dfafd18 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 783.038842] env[68906]: DEBUG oslo_vmware.api [None req-6d6efef2-b262-4393-93df-3cbea2feabcc tempest-TenantUsagesTestJSON-2001711929 tempest-TenantUsagesTestJSON-2001711929-project-member] Waiting for the task: (returnval){ [ 783.038842] env[68906]: value = "session[52a3cb0d-1212-490c-2669-91043b4da4d8]5289673c-8fc1-c334-dfb1-51ddfdace8ad" [ 783.038842] env[68906]: _type = "Task" [ 783.038842] env[68906]: } to complete. {{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 783.047234] env[68906]: DEBUG oslo_vmware.api [None req-6d6efef2-b262-4393-93df-3cbea2feabcc tempest-TenantUsagesTestJSON-2001711929 tempest-TenantUsagesTestJSON-2001711929-project-member] Task: {'id': session[52a3cb0d-1212-490c-2669-91043b4da4d8]5289673c-8fc1-c334-dfb1-51ddfdace8ad, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 783.550679] env[68906]: DEBUG oslo_concurrency.lockutils [None req-6d6efef2-b262-4393-93df-3cbea2feabcc tempest-TenantUsagesTestJSON-2001711929 tempest-TenantUsagesTestJSON-2001711929-project-member] Releasing lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 783.551066] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-6d6efef2-b262-4393-93df-3cbea2feabcc tempest-TenantUsagesTestJSON-2001711929 tempest-TenantUsagesTestJSON-2001711929-project-member] [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] Processing image b1400c31-d33b-4e13-944f-4c645e62493e {{(pid=68906) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 783.551269] env[68906]: DEBUG oslo_concurrency.lockutils [None req-6d6efef2-b262-4393-93df-3cbea2feabcc tempest-TenantUsagesTestJSON-2001711929 tempest-TenantUsagesTestJSON-2001711929-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e/b1400c31-d33b-4e13-944f-4c645e62493e.vmdk" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 784.270554] env[68906]: DEBUG nova.compute.manager [req-9c2dd4ed-4e7c-4d83-995f-043f76e82b24 req-72d1cb63-2546-4a38-a685-d4d6d2ea824b service nova] [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] Received event network-changed-e381b776-4d9b-43e2-8259-362887d031b1 {{(pid=68906) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 784.270940] env[68906]: DEBUG nova.compute.manager [req-9c2dd4ed-4e7c-4d83-995f-043f76e82b24 req-72d1cb63-2546-4a38-a685-d4d6d2ea824b service nova] [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] Refreshing instance network info cache due to event network-changed-e381b776-4d9b-43e2-8259-362887d031b1. {{(pid=68906) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 784.271321] env[68906]: DEBUG oslo_concurrency.lockutils [req-9c2dd4ed-4e7c-4d83-995f-043f76e82b24 req-72d1cb63-2546-4a38-a685-d4d6d2ea824b service nova] Acquiring lock "refresh_cache-a7e0a28f-42a5-442e-b962-07771d2e6a27" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 784.271423] env[68906]: DEBUG oslo_concurrency.lockutils [req-9c2dd4ed-4e7c-4d83-995f-043f76e82b24 req-72d1cb63-2546-4a38-a685-d4d6d2ea824b service nova] Acquired lock "refresh_cache-a7e0a28f-42a5-442e-b962-07771d2e6a27" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 784.271554] env[68906]: DEBUG nova.network.neutron [req-9c2dd4ed-4e7c-4d83-995f-043f76e82b24 req-72d1cb63-2546-4a38-a685-d4d6d2ea824b service nova] [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] Refreshing network info cache for port e381b776-4d9b-43e2-8259-362887d031b1 {{(pid=68906) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 784.834532] env[68906]: DEBUG nova.network.neutron [req-9c2dd4ed-4e7c-4d83-995f-043f76e82b24 req-72d1cb63-2546-4a38-a685-d4d6d2ea824b service nova] [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] Updated VIF entry in instance network info cache for port e381b776-4d9b-43e2-8259-362887d031b1. 
{{(pid=68906) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 784.835174] env[68906]: DEBUG nova.network.neutron [req-9c2dd4ed-4e7c-4d83-995f-043f76e82b24 req-72d1cb63-2546-4a38-a685-d4d6d2ea824b service nova] [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] Updating instance_info_cache with network_info: [{"id": "e381b776-4d9b-43e2-8259-362887d031b1", "address": "fa:16:3e:c4:5e:ba", "network": {"id": "63efabfb-0028-4758-9626-5f9860440121", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.98", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "1ae7bf3a375d41c6af5e7536af51ffd1", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "69054a13-b7ef-44e1-bd3b-3ca5ba602848", "external-id": "nsx-vlan-transportzone-153", "segmentation_id": 153, "bound_drivers": {"0": "nsxv3"}}, "devname": "tape381b776-4d", "ovs_interfaceid": "e381b776-4d9b-43e2-8259-362887d031b1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68906) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 784.848737] env[68906]: DEBUG oslo_concurrency.lockutils [req-9c2dd4ed-4e7c-4d83-995f-043f76e82b24 req-72d1cb63-2546-4a38-a685-d4d6d2ea824b service nova] Releasing lock "refresh_cache-a7e0a28f-42a5-442e-b962-07771d2e6a27" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 790.785819] env[68906]: DEBUG oslo_concurrency.lockutils [None req-e2cf9422-57fc-4510-acba-da32daf36063 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Acquiring lock "e7286888-d79d-4632-9c06-69c1ef47fa50" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 790.785819] env[68906]: DEBUG oslo_concurrency.lockutils [None req-e2cf9422-57fc-4510-acba-da32daf36063 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Lock "e7286888-d79d-4632-9c06-69c1ef47fa50" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 812.994900] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 813.140355] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 814.141262] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None 
None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 815.135653] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 815.140487] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 816.140430] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 816.140686] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Starting heal instance info cache {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 816.140758] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Rebuilding the list of instances to heal {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 816.161174] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: d2258ded-478a-4530-b940-386286702048] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 816.161334] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: da0c4340-a657-43bd-9a98-4c8f50add720] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 816.161466] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: 0540a4dc-1b86-4776-b633-f540af168a2b] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 816.161593] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: 4edb8b9f-b608-4be8-bfd3-65642710f9bd] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 816.161769] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: d6ca51b9-b284-405c-878e-fdbc326b73e1] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 816.161856] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: f42056e5-52cb-4d69-8022-ca643c49194e] Skipping network cache update for instance because it is Building. 
{{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 816.161983] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: ce63789a-1f0f-40ca-8368-ac3f84bb58cd] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 816.162120] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: 9a2d2803-34b1-40f7-9349-e5734a217e18] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 816.162240] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: 13eebe4e-5984-46c3-bb73-cd783ad45df6] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 816.162357] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 816.162476] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Didn't find any instances for network info cache update. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 816.162953] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 816.163153] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 816.163288] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=68906) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 817.141268] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 818.140619] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager.update_available_resource {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 818.153671] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 818.154025] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 818.154133] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 818.156021] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=68906) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 818.156021] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6e1d77b8-e1af-4414-9e2c-5e717356bf7c {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 818.164466] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2a7f1058-bf16-447c-953b-6e6958114910 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 818.178770] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-80f5a9b9-989b-4ce3-86f1-d79875139eb8 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 818.186019] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f9ef37a7-fc5a-4825-959f-5ac3294684a5 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 818.215694] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180945MB free_disk=93GB free_vcpus=48 pci_devices=None {{(pid=68906) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 818.215888] env[68906]: DEBUG 
oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 818.216062] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 818.288028] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance d2258ded-478a-4530-b940-386286702048 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 818.288028] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance da0c4340-a657-43bd-9a98-4c8f50add720 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 818.288028] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 0540a4dc-1b86-4776-b633-f540af168a2b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 818.288199] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 4edb8b9f-b608-4be8-bfd3-65642710f9bd actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 818.288236] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance d6ca51b9-b284-405c-878e-fdbc326b73e1 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 818.288340] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance f42056e5-52cb-4d69-8022-ca643c49194e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 818.288458] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance ce63789a-1f0f-40ca-8368-ac3f84bb58cd actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 818.288570] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 13eebe4e-5984-46c3-bb73-cd783ad45df6 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 818.289468] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 9a2d2803-34b1-40f7-9349-e5734a217e18 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 818.289468] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance a7e0a28f-42a5-442e-b962-07771d2e6a27 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 818.302243] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance eb81e9b1-b573-4d7c-9ede-f8b32a43a201 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 818.313493] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 12724be5-cfb1-4cf6-b98b-b4142da21714 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 818.323746] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance b3bd0ecb-f329-48f3-b48b-25751262a5eb has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 818.336020] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance e63fba5c-46fd-494c-9aec-dd76f12974d7 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 818.345735] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance c02f41e2-8a99-4f18-9d86-82fa702bb2b6 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 818.358259] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance a2414623-7871-4706-81db-7d15ca74fdab has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 818.367628] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 252028a3-3d3e-44c5-9c51-26752962a90d has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 818.377446] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 5ebd4d05-ddb3-4001-a526-a0c96b081818 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 818.387426] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance c2e2265b-aef3-4a8c-ae03-314e679af64b has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 818.398861] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 3ce9d4bd-3d7a-4191-9d3e-892efc81e8a2 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 818.408375] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 3f26342e-89a8-4218-8875-8411eb8b16a0 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 818.418265] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 1fb9796e-e0d4-410d-bff1-a6a44b2a3580 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 818.428587] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance b77ff68e-350b-4f65-bb62-dfb727281e50 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 818.437870] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 29e5aa99-4e20-4b6f-a749-544b8c41a713 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 818.453019] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 20fa65c1-9ea0-4dc2-828e-8477c9f45baa has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 818.462362] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 98844da1-0e2a-46b5-8e72-c0f8dcd29b27 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 818.474020] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 91d405f9-5ad1-4e8e-9a32-1a5ef81ecc63 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 818.482722] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 653c016d-c596-4f45-a18e-55f2d1935166 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 818.492372] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 627c0227-72ca-4a77-aca1-bc3112955e7a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 818.502028] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 03e8dff3-b6b8-4754-8725-dddc9f9e6216 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 818.512102] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance d3c5fdf4-a775-4b88-9bc2-ce9f31a9e6ac has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 818.523541] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 242433e2-5b59-4b19-ba8d-80432ee4b7b7 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 818.534319] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance acc11633-a489-4d8f-ad76-f17049a91545 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 818.543991] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 0874bf05-e156-404e-a067-869e370fd14b has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 818.554919] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance e7286888-d79d-4632-9c06-69c1ef47fa50 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 818.554919] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=68906) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 818.554919] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=68906) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 818.962098] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bf540996-b451-4ad9-acb4-df8cb9b5b987 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 818.969648] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d7ba5548-f420-4830-8dfd-e69f5b6c383c {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 818.999760] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4ea41d83-1604-4025-b90c-4fad355f7990 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 819.006540] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ffc13443-a2af-46d5-89db-3bfaab794118 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 819.018946] env[68906]: DEBUG nova.compute.provider_tree [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Inventory has not changed in ProviderTree for provider: 1119f6db-bfd7-4ef3-bdff-5c6974dc249b {{(pid=68906) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 819.027137] env[68906]: DEBUG nova.scheduler.client.report [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Inventory has not changed for provider 1119f6db-bfd7-4ef3-bdff-5c6974dc249b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 93, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68906) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 819.040580] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=68906) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 819.040580] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.824s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 827.539630] env[68906]: WARNING oslo_vmware.rw_handles [None 
req-77526f0d-ba0d-4648-acf7-33789433c0ca tempest-ImagesNegativeTestJSON-533604680 tempest-ImagesNegativeTestJSON-533604680-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 827.539630] env[68906]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 827.539630] env[68906]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 827.539630] env[68906]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 827.539630] env[68906]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 827.539630] env[68906]: ERROR oslo_vmware.rw_handles response.begin() [ 827.539630] env[68906]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 827.539630] env[68906]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 827.539630] env[68906]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 827.539630] env[68906]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 827.539630] env[68906]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 827.539630] env[68906]: ERROR oslo_vmware.rw_handles [ 827.540265] env[68906]: DEBUG nova.virt.vmwareapi.images [None req-77526f0d-ba0d-4648-acf7-33789433c0ca tempest-ImagesNegativeTestJSON-533604680 tempest-ImagesNegativeTestJSON-533604680-project-member] [instance: d2258ded-478a-4530-b940-386286702048] Downloaded image file data b1400c31-d33b-4e13-944f-4c645e62493e to vmware_temp/d348da25-6c16-4330-9645-fa4c032c0ae8/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk on the data store datastore2 {{(pid=68906) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 827.541732] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-77526f0d-ba0d-4648-acf7-33789433c0ca tempest-ImagesNegativeTestJSON-533604680 tempest-ImagesNegativeTestJSON-533604680-project-member] [instance: d2258ded-478a-4530-b940-386286702048] Caching image {{(pid=68906) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 827.541984] env[68906]: DEBUG nova.virt.vmwareapi.vm_util [None req-77526f0d-ba0d-4648-acf7-33789433c0ca tempest-ImagesNegativeTestJSON-533604680 tempest-ImagesNegativeTestJSON-533604680-project-member] Copying Virtual Disk [datastore2] vmware_temp/d348da25-6c16-4330-9645-fa4c032c0ae8/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk to [datastore2] vmware_temp/d348da25-6c16-4330-9645-fa4c032c0ae8/b1400c31-d33b-4e13-944f-4c645e62493e/b1400c31-d33b-4e13-944f-4c645e62493e.vmdk {{(pid=68906) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 827.542341] env[68906]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-e7ae2c88-bb8d-4596-bf44-4ac188ef6f62 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 827.551535] env[68906]: DEBUG oslo_vmware.api [None req-77526f0d-ba0d-4648-acf7-33789433c0ca tempest-ImagesNegativeTestJSON-533604680 tempest-ImagesNegativeTestJSON-533604680-project-member] Waiting for the task: (returnval){ [ 827.551535] env[68906]: value = "task-3475309" [ 827.551535] env[68906]: _type = "Task" [ 827.551535] env[68906]: } to complete. 
{{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 827.559312] env[68906]: DEBUG oslo_vmware.api [None req-77526f0d-ba0d-4648-acf7-33789433c0ca tempest-ImagesNegativeTestJSON-533604680 tempest-ImagesNegativeTestJSON-533604680-project-member] Task: {'id': task-3475309, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 828.063763] env[68906]: DEBUG oslo_vmware.exceptions [None req-77526f0d-ba0d-4648-acf7-33789433c0ca tempest-ImagesNegativeTestJSON-533604680 tempest-ImagesNegativeTestJSON-533604680-project-member] Fault InvalidArgument not matched. {{(pid=68906) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 828.064059] env[68906]: DEBUG oslo_concurrency.lockutils [None req-77526f0d-ba0d-4648-acf7-33789433c0ca tempest-ImagesNegativeTestJSON-533604680 tempest-ImagesNegativeTestJSON-533604680-project-member] Releasing lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e/b1400c31-d33b-4e13-944f-4c645e62493e.vmdk" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 828.064639] env[68906]: ERROR nova.compute.manager [None req-77526f0d-ba0d-4648-acf7-33789433c0ca tempest-ImagesNegativeTestJSON-533604680 tempest-ImagesNegativeTestJSON-533604680-project-member] [instance: d2258ded-478a-4530-b940-386286702048] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 828.064639] env[68906]: Faults: ['InvalidArgument'] [ 828.064639] env[68906]: ERROR nova.compute.manager [instance: d2258ded-478a-4530-b940-386286702048] Traceback (most recent call last): [ 828.064639] env[68906]: ERROR nova.compute.manager [instance: d2258ded-478a-4530-b940-386286702048] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 828.064639] env[68906]: ERROR nova.compute.manager [instance: d2258ded-478a-4530-b940-386286702048] yield resources [ 828.064639] env[68906]: ERROR nova.compute.manager [instance: d2258ded-478a-4530-b940-386286702048] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 828.064639] env[68906]: ERROR nova.compute.manager [instance: d2258ded-478a-4530-b940-386286702048] self.driver.spawn(context, instance, image_meta, [ 828.064639] env[68906]: ERROR nova.compute.manager [instance: d2258ded-478a-4530-b940-386286702048] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 828.064639] env[68906]: ERROR nova.compute.manager [instance: d2258ded-478a-4530-b940-386286702048] self._vmops.spawn(context, instance, image_meta, injected_files, [ 828.064639] env[68906]: ERROR nova.compute.manager [instance: d2258ded-478a-4530-b940-386286702048] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 828.064639] env[68906]: ERROR nova.compute.manager [instance: d2258ded-478a-4530-b940-386286702048] self._fetch_image_if_missing(context, vi) [ 828.064639] env[68906]: ERROR nova.compute.manager [instance: d2258ded-478a-4530-b940-386286702048] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 828.064994] env[68906]: ERROR nova.compute.manager [instance: d2258ded-478a-4530-b940-386286702048] image_cache(vi, tmp_image_ds_loc) [ 828.064994] env[68906]: ERROR nova.compute.manager [instance: d2258ded-478a-4530-b940-386286702048] File 
"/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 828.064994] env[68906]: ERROR nova.compute.manager [instance: d2258ded-478a-4530-b940-386286702048] vm_util.copy_virtual_disk( [ 828.064994] env[68906]: ERROR nova.compute.manager [instance: d2258ded-478a-4530-b940-386286702048] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 828.064994] env[68906]: ERROR nova.compute.manager [instance: d2258ded-478a-4530-b940-386286702048] session._wait_for_task(vmdk_copy_task) [ 828.064994] env[68906]: ERROR nova.compute.manager [instance: d2258ded-478a-4530-b940-386286702048] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 828.064994] env[68906]: ERROR nova.compute.manager [instance: d2258ded-478a-4530-b940-386286702048] return self.wait_for_task(task_ref) [ 828.064994] env[68906]: ERROR nova.compute.manager [instance: d2258ded-478a-4530-b940-386286702048] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 828.064994] env[68906]: ERROR nova.compute.manager [instance: d2258ded-478a-4530-b940-386286702048] return evt.wait() [ 828.064994] env[68906]: ERROR nova.compute.manager [instance: d2258ded-478a-4530-b940-386286702048] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 828.064994] env[68906]: ERROR nova.compute.manager [instance: d2258ded-478a-4530-b940-386286702048] result = hub.switch() [ 828.064994] env[68906]: ERROR nova.compute.manager [instance: d2258ded-478a-4530-b940-386286702048] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 828.064994] env[68906]: ERROR nova.compute.manager [instance: d2258ded-478a-4530-b940-386286702048] return self.greenlet.switch() [ 828.065416] env[68906]: ERROR nova.compute.manager [instance: d2258ded-478a-4530-b940-386286702048] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 828.065416] env[68906]: ERROR nova.compute.manager [instance: d2258ded-478a-4530-b940-386286702048] self.f(*self.args, **self.kw) [ 828.065416] env[68906]: ERROR nova.compute.manager [instance: d2258ded-478a-4530-b940-386286702048] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 828.065416] env[68906]: ERROR nova.compute.manager [instance: d2258ded-478a-4530-b940-386286702048] raise exceptions.translate_fault(task_info.error) [ 828.065416] env[68906]: ERROR nova.compute.manager [instance: d2258ded-478a-4530-b940-386286702048] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 828.065416] env[68906]: ERROR nova.compute.manager [instance: d2258ded-478a-4530-b940-386286702048] Faults: ['InvalidArgument'] [ 828.065416] env[68906]: ERROR nova.compute.manager [instance: d2258ded-478a-4530-b940-386286702048] [ 828.065416] env[68906]: INFO nova.compute.manager [None req-77526f0d-ba0d-4648-acf7-33789433c0ca tempest-ImagesNegativeTestJSON-533604680 tempest-ImagesNegativeTestJSON-533604680-project-member] [instance: d2258ded-478a-4530-b940-386286702048] Terminating instance [ 828.066587] env[68906]: DEBUG oslo_concurrency.lockutils [None req-3d9af540-455f-4eac-be79-8ed7933ffa43 tempest-MigrationsAdminTest-890819033 tempest-MigrationsAdminTest-890819033-project-member] Acquired lock "[datastore2] 
devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e/b1400c31-d33b-4e13-944f-4c645e62493e.vmdk" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 828.066803] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-3d9af540-455f-4eac-be79-8ed7933ffa43 tempest-MigrationsAdminTest-890819033 tempest-MigrationsAdminTest-890819033-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=68906) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 828.067877] env[68906]: DEBUG nova.compute.manager [None req-77526f0d-ba0d-4648-acf7-33789433c0ca tempest-ImagesNegativeTestJSON-533604680 tempest-ImagesNegativeTestJSON-533604680-project-member] [instance: d2258ded-478a-4530-b940-386286702048] Start destroying the instance on the hypervisor. {{(pid=68906) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 828.067877] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-77526f0d-ba0d-4648-acf7-33789433c0ca tempest-ImagesNegativeTestJSON-533604680 tempest-ImagesNegativeTestJSON-533604680-project-member] [instance: d2258ded-478a-4530-b940-386286702048] Destroying instance {{(pid=68906) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 828.067877] env[68906]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-2ef788b0-38f1-41cc-a8f6-219b897a700c {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 828.070478] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-89cddc18-1443-46bb-8e95-2e807dd6f9ae {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 828.078578] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-77526f0d-ba0d-4648-acf7-33789433c0ca tempest-ImagesNegativeTestJSON-533604680 tempest-ImagesNegativeTestJSON-533604680-project-member] [instance: d2258ded-478a-4530-b940-386286702048] Unregistering the VM {{(pid=68906) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 828.078838] env[68906]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-d8844c17-9cb9-4adf-b2e6-7081dcf86bfa {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 828.081431] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-3d9af540-455f-4eac-be79-8ed7933ffa43 tempest-MigrationsAdminTest-890819033 tempest-MigrationsAdminTest-890819033-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=68906) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 828.081606] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-3d9af540-455f-4eac-be79-8ed7933ffa43 tempest-MigrationsAdminTest-890819033 tempest-MigrationsAdminTest-890819033-project-member] Folder [datastore2] devstack-image-cache_base created. 
{{(pid=68906) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 828.082667] env[68906]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-5eb33301-e2ee-4269-84f1-908544c30d09 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 828.087997] env[68906]: DEBUG oslo_vmware.api [None req-3d9af540-455f-4eac-be79-8ed7933ffa43 tempest-MigrationsAdminTest-890819033 tempest-MigrationsAdminTest-890819033-project-member] Waiting for the task: (returnval){ [ 828.087997] env[68906]: value = "session[52a3cb0d-1212-490c-2669-91043b4da4d8]52e4acf9-ed4a-4c51-5bd4-901554cf6d50" [ 828.087997] env[68906]: _type = "Task" [ 828.087997] env[68906]: } to complete. {{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 828.097018] env[68906]: DEBUG oslo_vmware.api [None req-3d9af540-455f-4eac-be79-8ed7933ffa43 tempest-MigrationsAdminTest-890819033 tempest-MigrationsAdminTest-890819033-project-member] Task: {'id': session[52a3cb0d-1212-490c-2669-91043b4da4d8]52e4acf9-ed4a-4c51-5bd4-901554cf6d50, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 828.166435] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-77526f0d-ba0d-4648-acf7-33789433c0ca tempest-ImagesNegativeTestJSON-533604680 tempest-ImagesNegativeTestJSON-533604680-project-member] [instance: d2258ded-478a-4530-b940-386286702048] Unregistered the VM {{(pid=68906) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 828.166694] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-77526f0d-ba0d-4648-acf7-33789433c0ca tempest-ImagesNegativeTestJSON-533604680 tempest-ImagesNegativeTestJSON-533604680-project-member] [instance: d2258ded-478a-4530-b940-386286702048] Deleting contents of the VM from datastore datastore2 {{(pid=68906) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 828.167068] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-77526f0d-ba0d-4648-acf7-33789433c0ca tempest-ImagesNegativeTestJSON-533604680 tempest-ImagesNegativeTestJSON-533604680-project-member] Deleting the datastore file [datastore2] d2258ded-478a-4530-b940-386286702048 {{(pid=68906) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 828.167288] env[68906]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-8aaaca05-3662-4ab1-b844-87305201baf7 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 828.174118] env[68906]: DEBUG oslo_vmware.api [None req-77526f0d-ba0d-4648-acf7-33789433c0ca tempest-ImagesNegativeTestJSON-533604680 tempest-ImagesNegativeTestJSON-533604680-project-member] Waiting for the task: (returnval){ [ 828.174118] env[68906]: value = "task-3475311" [ 828.174118] env[68906]: _type = "Task" [ 828.174118] env[68906]: } to complete. {{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 828.182757] env[68906]: DEBUG oslo_vmware.api [None req-77526f0d-ba0d-4648-acf7-33789433c0ca tempest-ImagesNegativeTestJSON-533604680 tempest-ImagesNegativeTestJSON-533604680-project-member] Task: {'id': task-3475311, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 828.599857] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-3d9af540-455f-4eac-be79-8ed7933ffa43 tempest-MigrationsAdminTest-890819033 tempest-MigrationsAdminTest-890819033-project-member] [instance: da0c4340-a657-43bd-9a98-4c8f50add720] Preparing fetch location {{(pid=68906) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 828.600302] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-3d9af540-455f-4eac-be79-8ed7933ffa43 tempest-MigrationsAdminTest-890819033 tempest-MigrationsAdminTest-890819033-project-member] Creating directory with path [datastore2] vmware_temp/c9595509-74ef-458d-a32f-7559174cf261/b1400c31-d33b-4e13-944f-4c645e62493e {{(pid=68906) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 828.600563] env[68906]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-e47ed835-0371-43ba-b919-ee08369e45f2 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 828.613227] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-3d9af540-455f-4eac-be79-8ed7933ffa43 tempest-MigrationsAdminTest-890819033 tempest-MigrationsAdminTest-890819033-project-member] Created directory with path [datastore2] vmware_temp/c9595509-74ef-458d-a32f-7559174cf261/b1400c31-d33b-4e13-944f-4c645e62493e {{(pid=68906) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 828.613512] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-3d9af540-455f-4eac-be79-8ed7933ffa43 tempest-MigrationsAdminTest-890819033 tempest-MigrationsAdminTest-890819033-project-member] [instance: da0c4340-a657-43bd-9a98-4c8f50add720] Fetch image to [datastore2] vmware_temp/c9595509-74ef-458d-a32f-7559174cf261/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk {{(pid=68906) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 828.613686] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-3d9af540-455f-4eac-be79-8ed7933ffa43 tempest-MigrationsAdminTest-890819033 tempest-MigrationsAdminTest-890819033-project-member] [instance: da0c4340-a657-43bd-9a98-4c8f50add720] Downloading image file data b1400c31-d33b-4e13-944f-4c645e62493e to [datastore2] vmware_temp/c9595509-74ef-458d-a32f-7559174cf261/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk on the data store datastore2 {{(pid=68906) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 828.614486] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-382b6bba-9f96-4266-9b40-5a90c4f67b2f {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 828.621699] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2d8900b0-ac89-44d7-b12a-d5693855d06b {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 828.631556] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f1eba2f4-7ba4-4d25-9ce5-2dee7a7ae822 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 828.665489] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-891051d7-5fcb-4362-8e6d-994df8c52022 {{(pid=68906) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 828.671893] env[68906]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-19e851bc-6fe3-43c2-b519-ff13c47f471c {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 828.683241] env[68906]: DEBUG oslo_vmware.api [None req-77526f0d-ba0d-4648-acf7-33789433c0ca tempest-ImagesNegativeTestJSON-533604680 tempest-ImagesNegativeTestJSON-533604680-project-member] Task: {'id': task-3475311, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.08243} completed successfully. {{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 828.683504] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-77526f0d-ba0d-4648-acf7-33789433c0ca tempest-ImagesNegativeTestJSON-533604680 tempest-ImagesNegativeTestJSON-533604680-project-member] Deleted the datastore file {{(pid=68906) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 828.683800] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-77526f0d-ba0d-4648-acf7-33789433c0ca tempest-ImagesNegativeTestJSON-533604680 tempest-ImagesNegativeTestJSON-533604680-project-member] [instance: d2258ded-478a-4530-b940-386286702048] Deleted contents of the VM from datastore datastore2 {{(pid=68906) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 828.683897] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-77526f0d-ba0d-4648-acf7-33789433c0ca tempest-ImagesNegativeTestJSON-533604680 tempest-ImagesNegativeTestJSON-533604680-project-member] [instance: d2258ded-478a-4530-b940-386286702048] Instance destroyed {{(pid=68906) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 828.684084] env[68906]: INFO nova.compute.manager [None req-77526f0d-ba0d-4648-acf7-33789433c0ca tempest-ImagesNegativeTestJSON-533604680 tempest-ImagesNegativeTestJSON-533604680-project-member] [instance: d2258ded-478a-4530-b940-386286702048] Took 0.62 seconds to destroy the instance on the hypervisor. 
[ 828.686377] env[68906]: DEBUG nova.compute.claims [None req-77526f0d-ba0d-4648-acf7-33789433c0ca tempest-ImagesNegativeTestJSON-533604680 tempest-ImagesNegativeTestJSON-533604680-project-member] [instance: d2258ded-478a-4530-b940-386286702048] Aborting claim: {{(pid=68906) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 828.686551] env[68906]: DEBUG oslo_concurrency.lockutils [None req-77526f0d-ba0d-4648-acf7-33789433c0ca tempest-ImagesNegativeTestJSON-533604680 tempest-ImagesNegativeTestJSON-533604680-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 828.686868] env[68906]: DEBUG oslo_concurrency.lockutils [None req-77526f0d-ba0d-4648-acf7-33789433c0ca tempest-ImagesNegativeTestJSON-533604680 tempest-ImagesNegativeTestJSON-533604680-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 828.699354] env[68906]: DEBUG nova.virt.vmwareapi.images [None req-3d9af540-455f-4eac-be79-8ed7933ffa43 tempest-MigrationsAdminTest-890819033 tempest-MigrationsAdminTest-890819033-project-member] [instance: da0c4340-a657-43bd-9a98-4c8f50add720] Downloading image file data b1400c31-d33b-4e13-944f-4c645e62493e to the data store datastore2 {{(pid=68906) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 828.756897] env[68906]: DEBUG oslo_vmware.rw_handles [None req-3d9af540-455f-4eac-be79-8ed7933ffa43 tempest-MigrationsAdminTest-890819033 tempest-MigrationsAdminTest-890819033-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/c9595509-74ef-458d-a32f-7559174cf261/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=68906) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 828.818725] env[68906]: DEBUG oslo_vmware.rw_handles [None req-3d9af540-455f-4eac-be79-8ed7933ffa43 tempest-MigrationsAdminTest-890819033 tempest-MigrationsAdminTest-890819033-project-member] Completed reading data from the image iterator. {{(pid=68906) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 828.819039] env[68906]: DEBUG oslo_vmware.rw_handles [None req-3d9af540-455f-4eac-be79-8ed7933ffa43 tempest-MigrationsAdminTest-890819033 tempest-MigrationsAdminTest-890819033-project-member] Closing write handle for https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/c9595509-74ef-458d-a32f-7559174cf261/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=68906) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 829.255227] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4e4a7eeb-6dcc-4906-85d2-93aecd002ce2 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 829.262803] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-64efd5c8-9031-4ebc-b94e-4feef3c0d371 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 829.293255] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bad8f5f2-5805-4360-9f94-f6618077ea18 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 829.300967] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bc8aa003-1221-4098-884a-702f1914f0ef {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 829.313987] env[68906]: DEBUG nova.compute.provider_tree [None req-77526f0d-ba0d-4648-acf7-33789433c0ca tempest-ImagesNegativeTestJSON-533604680 tempest-ImagesNegativeTestJSON-533604680-project-member] Inventory has not changed in ProviderTree for provider: 1119f6db-bfd7-4ef3-bdff-5c6974dc249b {{(pid=68906) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 829.322106] env[68906]: DEBUG nova.scheduler.client.report [None req-77526f0d-ba0d-4648-acf7-33789433c0ca tempest-ImagesNegativeTestJSON-533604680 tempest-ImagesNegativeTestJSON-533604680-project-member] Inventory has not changed for provider 1119f6db-bfd7-4ef3-bdff-5c6974dc249b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 93, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68906) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 829.336054] env[68906]: DEBUG oslo_concurrency.lockutils [None req-77526f0d-ba0d-4648-acf7-33789433c0ca tempest-ImagesNegativeTestJSON-533604680 tempest-ImagesNegativeTestJSON-533604680-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.649s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 829.336599] env[68906]: ERROR nova.compute.manager [None req-77526f0d-ba0d-4648-acf7-33789433c0ca tempest-ImagesNegativeTestJSON-533604680 tempest-ImagesNegativeTestJSON-533604680-project-member] [instance: d2258ded-478a-4530-b940-386286702048] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 829.336599] env[68906]: Faults: ['InvalidArgument'] [ 829.336599] env[68906]: ERROR nova.compute.manager [instance: d2258ded-478a-4530-b940-386286702048] Traceback (most recent call last): [ 829.336599] env[68906]: ERROR nova.compute.manager [instance: d2258ded-478a-4530-b940-386286702048] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 829.336599] env[68906]: ERROR 
nova.compute.manager [instance: d2258ded-478a-4530-b940-386286702048] self.driver.spawn(context, instance, image_meta, [ 829.336599] env[68906]: ERROR nova.compute.manager [instance: d2258ded-478a-4530-b940-386286702048] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 829.336599] env[68906]: ERROR nova.compute.manager [instance: d2258ded-478a-4530-b940-386286702048] self._vmops.spawn(context, instance, image_meta, injected_files, [ 829.336599] env[68906]: ERROR nova.compute.manager [instance: d2258ded-478a-4530-b940-386286702048] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 829.336599] env[68906]: ERROR nova.compute.manager [instance: d2258ded-478a-4530-b940-386286702048] self._fetch_image_if_missing(context, vi) [ 829.336599] env[68906]: ERROR nova.compute.manager [instance: d2258ded-478a-4530-b940-386286702048] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 829.336599] env[68906]: ERROR nova.compute.manager [instance: d2258ded-478a-4530-b940-386286702048] image_cache(vi, tmp_image_ds_loc) [ 829.336599] env[68906]: ERROR nova.compute.manager [instance: d2258ded-478a-4530-b940-386286702048] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 829.336918] env[68906]: ERROR nova.compute.manager [instance: d2258ded-478a-4530-b940-386286702048] vm_util.copy_virtual_disk( [ 829.336918] env[68906]: ERROR nova.compute.manager [instance: d2258ded-478a-4530-b940-386286702048] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 829.336918] env[68906]: ERROR nova.compute.manager [instance: d2258ded-478a-4530-b940-386286702048] session._wait_for_task(vmdk_copy_task) [ 829.336918] env[68906]: ERROR nova.compute.manager [instance: d2258ded-478a-4530-b940-386286702048] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 829.336918] env[68906]: ERROR nova.compute.manager [instance: d2258ded-478a-4530-b940-386286702048] return self.wait_for_task(task_ref) [ 829.336918] env[68906]: ERROR nova.compute.manager [instance: d2258ded-478a-4530-b940-386286702048] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 829.336918] env[68906]: ERROR nova.compute.manager [instance: d2258ded-478a-4530-b940-386286702048] return evt.wait() [ 829.336918] env[68906]: ERROR nova.compute.manager [instance: d2258ded-478a-4530-b940-386286702048] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 829.336918] env[68906]: ERROR nova.compute.manager [instance: d2258ded-478a-4530-b940-386286702048] result = hub.switch() [ 829.336918] env[68906]: ERROR nova.compute.manager [instance: d2258ded-478a-4530-b940-386286702048] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 829.336918] env[68906]: ERROR nova.compute.manager [instance: d2258ded-478a-4530-b940-386286702048] return self.greenlet.switch() [ 829.336918] env[68906]: ERROR nova.compute.manager [instance: d2258ded-478a-4530-b940-386286702048] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 829.336918] env[68906]: ERROR nova.compute.manager [instance: d2258ded-478a-4530-b940-386286702048] self.f(*self.args, **self.kw) [ 829.337439] env[68906]: ERROR nova.compute.manager [instance: d2258ded-478a-4530-b940-386286702048] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 829.337439] env[68906]: ERROR nova.compute.manager [instance: d2258ded-478a-4530-b940-386286702048] raise exceptions.translate_fault(task_info.error) [ 829.337439] env[68906]: ERROR nova.compute.manager [instance: d2258ded-478a-4530-b940-386286702048] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 829.337439] env[68906]: ERROR nova.compute.manager [instance: d2258ded-478a-4530-b940-386286702048] Faults: ['InvalidArgument'] [ 829.337439] env[68906]: ERROR nova.compute.manager [instance: d2258ded-478a-4530-b940-386286702048] [ 829.337439] env[68906]: DEBUG nova.compute.utils [None req-77526f0d-ba0d-4648-acf7-33789433c0ca tempest-ImagesNegativeTestJSON-533604680 tempest-ImagesNegativeTestJSON-533604680-project-member] [instance: d2258ded-478a-4530-b940-386286702048] VimFaultException {{(pid=68906) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 829.338762] env[68906]: DEBUG nova.compute.manager [None req-77526f0d-ba0d-4648-acf7-33789433c0ca tempest-ImagesNegativeTestJSON-533604680 tempest-ImagesNegativeTestJSON-533604680-project-member] [instance: d2258ded-478a-4530-b940-386286702048] Build of instance d2258ded-478a-4530-b940-386286702048 was re-scheduled: A specified parameter was not correct: fileType [ 829.338762] env[68906]: Faults: ['InvalidArgument'] {{(pid=68906) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 829.339135] env[68906]: DEBUG nova.compute.manager [None req-77526f0d-ba0d-4648-acf7-33789433c0ca tempest-ImagesNegativeTestJSON-533604680 tempest-ImagesNegativeTestJSON-533604680-project-member] [instance: d2258ded-478a-4530-b940-386286702048] Unplugging VIFs for instance {{(pid=68906) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 829.339308] env[68906]: DEBUG nova.compute.manager [None req-77526f0d-ba0d-4648-acf7-33789433c0ca tempest-ImagesNegativeTestJSON-533604680 tempest-ImagesNegativeTestJSON-533604680-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=68906) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 829.339474] env[68906]: DEBUG nova.compute.manager [None req-77526f0d-ba0d-4648-acf7-33789433c0ca tempest-ImagesNegativeTestJSON-533604680 tempest-ImagesNegativeTestJSON-533604680-project-member] [instance: d2258ded-478a-4530-b940-386286702048] Deallocating network for instance {{(pid=68906) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 829.339634] env[68906]: DEBUG nova.network.neutron [None req-77526f0d-ba0d-4648-acf7-33789433c0ca tempest-ImagesNegativeTestJSON-533604680 tempest-ImagesNegativeTestJSON-533604680-project-member] [instance: d2258ded-478a-4530-b940-386286702048] deallocate_for_instance() {{(pid=68906) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 829.842289] env[68906]: DEBUG nova.network.neutron [None req-77526f0d-ba0d-4648-acf7-33789433c0ca tempest-ImagesNegativeTestJSON-533604680 tempest-ImagesNegativeTestJSON-533604680-project-member] [instance: d2258ded-478a-4530-b940-386286702048] Updating instance_info_cache with network_info: [] {{(pid=68906) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 829.851920] env[68906]: INFO nova.compute.manager [None req-77526f0d-ba0d-4648-acf7-33789433c0ca tempest-ImagesNegativeTestJSON-533604680 tempest-ImagesNegativeTestJSON-533604680-project-member] [instance: d2258ded-478a-4530-b940-386286702048] Took 0.51 seconds to deallocate network for instance. [ 829.956716] env[68906]: INFO nova.scheduler.client.report [None req-77526f0d-ba0d-4648-acf7-33789433c0ca tempest-ImagesNegativeTestJSON-533604680 tempest-ImagesNegativeTestJSON-533604680-project-member] Deleted allocations for instance d2258ded-478a-4530-b940-386286702048 [ 829.983723] env[68906]: DEBUG oslo_concurrency.lockutils [None req-77526f0d-ba0d-4648-acf7-33789433c0ca tempest-ImagesNegativeTestJSON-533604680 tempest-ImagesNegativeTestJSON-533604680-project-member] Lock "d2258ded-478a-4530-b940-386286702048" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 198.427s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 829.998184] env[68906]: DEBUG nova.compute.manager [None req-bce01e0c-e35b-4a41-8089-b5ea423b267e tempest-ServerDiagnosticsV248Test-570687926 tempest-ServerDiagnosticsV248Test-570687926-project-member] [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201] Starting instance... 
{{(pid=68906) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 830.048597] env[68906]: DEBUG oslo_concurrency.lockutils [None req-bce01e0c-e35b-4a41-8089-b5ea423b267e tempest-ServerDiagnosticsV248Test-570687926 tempest-ServerDiagnosticsV248Test-570687926-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 830.048597] env[68906]: DEBUG oslo_concurrency.lockutils [None req-bce01e0c-e35b-4a41-8089-b5ea423b267e tempest-ServerDiagnosticsV248Test-570687926 tempest-ServerDiagnosticsV248Test-570687926-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 830.049318] env[68906]: INFO nova.compute.claims [None req-bce01e0c-e35b-4a41-8089-b5ea423b267e tempest-ServerDiagnosticsV248Test-570687926 tempest-ServerDiagnosticsV248Test-570687926-project-member] [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 830.487626] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dc2c7ead-b28b-4512-ac0d-54cc1efa72c0 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 830.495387] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-67639cf5-e686-40be-9d28-266f572ddde7 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 830.525188] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-264a2518-2237-4927-986e-bfbd9329c9c2 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 830.532430] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-db2e556f-2a23-4bc2-a57d-9a37fea59e50 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 830.545584] env[68906]: DEBUG nova.compute.provider_tree [None req-bce01e0c-e35b-4a41-8089-b5ea423b267e tempest-ServerDiagnosticsV248Test-570687926 tempest-ServerDiagnosticsV248Test-570687926-project-member] Inventory has not changed in ProviderTree for provider: 1119f6db-bfd7-4ef3-bdff-5c6974dc249b {{(pid=68906) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 830.554467] env[68906]: DEBUG nova.scheduler.client.report [None req-bce01e0c-e35b-4a41-8089-b5ea423b267e tempest-ServerDiagnosticsV248Test-570687926 tempest-ServerDiagnosticsV248Test-570687926-project-member] Inventory has not changed for provider 1119f6db-bfd7-4ef3-bdff-5c6974dc249b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 93, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68906) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 830.569721] env[68906]: DEBUG oslo_concurrency.lockutils 
[None req-bce01e0c-e35b-4a41-8089-b5ea423b267e tempest-ServerDiagnosticsV248Test-570687926 tempest-ServerDiagnosticsV248Test-570687926-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.522s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 830.570215] env[68906]: DEBUG nova.compute.manager [None req-bce01e0c-e35b-4a41-8089-b5ea423b267e tempest-ServerDiagnosticsV248Test-570687926 tempest-ServerDiagnosticsV248Test-570687926-project-member] [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201] Start building networks asynchronously for instance. {{(pid=68906) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 830.603028] env[68906]: DEBUG nova.compute.utils [None req-bce01e0c-e35b-4a41-8089-b5ea423b267e tempest-ServerDiagnosticsV248Test-570687926 tempest-ServerDiagnosticsV248Test-570687926-project-member] Using /dev/sd instead of None {{(pid=68906) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 830.604082] env[68906]: DEBUG nova.compute.manager [None req-bce01e0c-e35b-4a41-8089-b5ea423b267e tempest-ServerDiagnosticsV248Test-570687926 tempest-ServerDiagnosticsV248Test-570687926-project-member] [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201] Not allocating networking since 'none' was specified. {{(pid=68906) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 830.612590] env[68906]: DEBUG nova.compute.manager [None req-bce01e0c-e35b-4a41-8089-b5ea423b267e tempest-ServerDiagnosticsV248Test-570687926 tempest-ServerDiagnosticsV248Test-570687926-project-member] [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201] Start building block device mappings for instance. {{(pid=68906) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 830.679023] env[68906]: DEBUG nova.compute.manager [None req-bce01e0c-e35b-4a41-8089-b5ea423b267e tempest-ServerDiagnosticsV248Test-570687926 tempest-ServerDiagnosticsV248Test-570687926-project-member] [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201] Start spawning the instance on the hypervisor. 
{{(pid=68906) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 830.706288] env[68906]: DEBUG nova.virt.hardware [None req-bce01e0c-e35b-4a41-8089-b5ea423b267e tempest-ServerDiagnosticsV248Test-570687926 tempest-ServerDiagnosticsV248Test-570687926-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-17T13:00:39Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-17T13:00:23Z,direct_url=,disk_format='vmdk',id=b1400c31-d33b-4e13-944f-4c645e62493e,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='1ae7bf3a375d41c6af5e7536af51ffd1',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-17T13:00:24Z,virtual_size=,visibility=), allow threads: False {{(pid=68906) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 830.706533] env[68906]: DEBUG nova.virt.hardware [None req-bce01e0c-e35b-4a41-8089-b5ea423b267e tempest-ServerDiagnosticsV248Test-570687926 tempest-ServerDiagnosticsV248Test-570687926-project-member] Flavor limits 0:0:0 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 830.706688] env[68906]: DEBUG nova.virt.hardware [None req-bce01e0c-e35b-4a41-8089-b5ea423b267e tempest-ServerDiagnosticsV248Test-570687926 tempest-ServerDiagnosticsV248Test-570687926-project-member] Image limits 0:0:0 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 830.706864] env[68906]: DEBUG nova.virt.hardware [None req-bce01e0c-e35b-4a41-8089-b5ea423b267e tempest-ServerDiagnosticsV248Test-570687926 tempest-ServerDiagnosticsV248Test-570687926-project-member] Flavor pref 0:0:0 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 830.707050] env[68906]: DEBUG nova.virt.hardware [None req-bce01e0c-e35b-4a41-8089-b5ea423b267e tempest-ServerDiagnosticsV248Test-570687926 tempest-ServerDiagnosticsV248Test-570687926-project-member] Image pref 0:0:0 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 830.707273] env[68906]: DEBUG nova.virt.hardware [None req-bce01e0c-e35b-4a41-8089-b5ea423b267e tempest-ServerDiagnosticsV248Test-570687926 tempest-ServerDiagnosticsV248Test-570687926-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 830.707502] env[68906]: DEBUG nova.virt.hardware [None req-bce01e0c-e35b-4a41-8089-b5ea423b267e tempest-ServerDiagnosticsV248Test-570687926 tempest-ServerDiagnosticsV248Test-570687926-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68906) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 830.707662] env[68906]: DEBUG nova.virt.hardware [None req-bce01e0c-e35b-4a41-8089-b5ea423b267e tempest-ServerDiagnosticsV248Test-570687926 tempest-ServerDiagnosticsV248Test-570687926-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68906) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 830.707830] 
env[68906]: DEBUG nova.virt.hardware [None req-bce01e0c-e35b-4a41-8089-b5ea423b267e tempest-ServerDiagnosticsV248Test-570687926 tempest-ServerDiagnosticsV248Test-570687926-project-member] Got 1 possible topologies {{(pid=68906) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 830.707988] env[68906]: DEBUG nova.virt.hardware [None req-bce01e0c-e35b-4a41-8089-b5ea423b267e tempest-ServerDiagnosticsV248Test-570687926 tempest-ServerDiagnosticsV248Test-570687926-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68906) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 830.708181] env[68906]: DEBUG nova.virt.hardware [None req-bce01e0c-e35b-4a41-8089-b5ea423b267e tempest-ServerDiagnosticsV248Test-570687926 tempest-ServerDiagnosticsV248Test-570687926-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68906) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 830.709028] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4d806a8d-fd0a-4f13-b105-fe01ebecf82f {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 830.717144] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3a0d0351-cb50-4ad6-8f90-a7e3a37e0300 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 830.730895] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-bce01e0c-e35b-4a41-8089-b5ea423b267e tempest-ServerDiagnosticsV248Test-570687926 tempest-ServerDiagnosticsV248Test-570687926-project-member] [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201] Instance VIF info [] {{(pid=68906) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 830.736365] env[68906]: DEBUG nova.virt.vmwareapi.vm_util [None req-bce01e0c-e35b-4a41-8089-b5ea423b267e tempest-ServerDiagnosticsV248Test-570687926 tempest-ServerDiagnosticsV248Test-570687926-project-member] Creating folder: Project (4d3487431d1b4ba191095fedb8ef3eb2). Parent ref: group-v694750. {{(pid=68906) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 830.736619] env[68906]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-ab0c0bed-1427-4a68-8a24-651afbfc973d {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 830.746358] env[68906]: INFO nova.virt.vmwareapi.vm_util [None req-bce01e0c-e35b-4a41-8089-b5ea423b267e tempest-ServerDiagnosticsV248Test-570687926 tempest-ServerDiagnosticsV248Test-570687926-project-member] Created folder: Project (4d3487431d1b4ba191095fedb8ef3eb2) in parent group-v694750. [ 830.746569] env[68906]: DEBUG nova.virt.vmwareapi.vm_util [None req-bce01e0c-e35b-4a41-8089-b5ea423b267e tempest-ServerDiagnosticsV248Test-570687926 tempest-ServerDiagnosticsV248Test-570687926-project-member] Creating folder: Instances. Parent ref: group-v694792. 
{{(pid=68906) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 830.746748] env[68906]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-10c09bd1-9aca-4088-b726-e9d55c1d660c {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 830.755317] env[68906]: INFO nova.virt.vmwareapi.vm_util [None req-bce01e0c-e35b-4a41-8089-b5ea423b267e tempest-ServerDiagnosticsV248Test-570687926 tempest-ServerDiagnosticsV248Test-570687926-project-member] Created folder: Instances in parent group-v694792. [ 830.755560] env[68906]: DEBUG oslo.service.loopingcall [None req-bce01e0c-e35b-4a41-8089-b5ea423b267e tempest-ServerDiagnosticsV248Test-570687926 tempest-ServerDiagnosticsV248Test-570687926-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=68906) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 830.755819] env[68906]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201] Creating VM on the ESX host {{(pid=68906) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 830.755920] env[68906]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-12873e07-36a6-4f65-8319-e23997d1ec90 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 830.772922] env[68906]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 830.772922] env[68906]: value = "task-3475314" [ 830.772922] env[68906]: _type = "Task" [ 830.772922] env[68906]: } to complete. {{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 830.780201] env[68906]: DEBUG oslo_vmware.api [-] Task: {'id': task-3475314, 'name': CreateVM_Task} progress is 0%. {{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 830.953709] env[68906]: DEBUG oslo_concurrency.lockutils [None req-74c68a04-99ad-4af6-a0d0-f24a0942ce19 tempest-MigrationsAdminTest-890819033 tempest-MigrationsAdminTest-890819033-project-member] Acquiring lock "da0c4340-a657-43bd-9a98-4c8f50add720" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 831.284866] env[68906]: DEBUG oslo_vmware.api [-] Task: {'id': task-3475314, 'name': CreateVM_Task, 'duration_secs': 0.245241} completed successfully. 
{{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 831.285065] env[68906]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201] Created VM on the ESX host {{(pid=68906) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 831.285480] env[68906]: DEBUG oslo_concurrency.lockutils [None req-bce01e0c-e35b-4a41-8089-b5ea423b267e tempest-ServerDiagnosticsV248Test-570687926 tempest-ServerDiagnosticsV248Test-570687926-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 831.285638] env[68906]: DEBUG oslo_concurrency.lockutils [None req-bce01e0c-e35b-4a41-8089-b5ea423b267e tempest-ServerDiagnosticsV248Test-570687926 tempest-ServerDiagnosticsV248Test-570687926-project-member] Acquired lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 831.285980] env[68906]: DEBUG oslo_concurrency.lockutils [None req-bce01e0c-e35b-4a41-8089-b5ea423b267e tempest-ServerDiagnosticsV248Test-570687926 tempest-ServerDiagnosticsV248Test-570687926-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 831.286245] env[68906]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-a7a8d728-1018-4b02-9d6b-2eb908b18856 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 831.290980] env[68906]: DEBUG oslo_vmware.api [None req-bce01e0c-e35b-4a41-8089-b5ea423b267e tempest-ServerDiagnosticsV248Test-570687926 tempest-ServerDiagnosticsV248Test-570687926-project-member] Waiting for the task: (returnval){ [ 831.290980] env[68906]: value = "session[52a3cb0d-1212-490c-2669-91043b4da4d8]52ca7a93-4bc7-842e-56c9-b197b99ef158" [ 831.290980] env[68906]: _type = "Task" [ 831.290980] env[68906]: } to complete. {{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 831.300583] env[68906]: DEBUG oslo_vmware.api [None req-bce01e0c-e35b-4a41-8089-b5ea423b267e tempest-ServerDiagnosticsV248Test-570687926 tempest-ServerDiagnosticsV248Test-570687926-project-member] Task: {'id': session[52a3cb0d-1212-490c-2669-91043b4da4d8]52ca7a93-4bc7-842e-56c9-b197b99ef158, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 831.802623] env[68906]: DEBUG oslo_concurrency.lockutils [None req-bce01e0c-e35b-4a41-8089-b5ea423b267e tempest-ServerDiagnosticsV248Test-570687926 tempest-ServerDiagnosticsV248Test-570687926-project-member] Releasing lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 831.803182] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-bce01e0c-e35b-4a41-8089-b5ea423b267e tempest-ServerDiagnosticsV248Test-570687926 tempest-ServerDiagnosticsV248Test-570687926-project-member] [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201] Processing image b1400c31-d33b-4e13-944f-4c645e62493e {{(pid=68906) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 831.803257] env[68906]: DEBUG oslo_concurrency.lockutils [None req-bce01e0c-e35b-4a41-8089-b5ea423b267e tempest-ServerDiagnosticsV248Test-570687926 tempest-ServerDiagnosticsV248Test-570687926-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e/b1400c31-d33b-4e13-944f-4c645e62493e.vmdk" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 837.218209] env[68906]: DEBUG oslo_concurrency.lockutils [None req-8e5cf191-d2ff-420a-92ea-3d8a1505aeef tempest-ServersTestManualDisk-672681136 tempest-ServersTestManualDisk-672681136-project-member] Acquiring lock "641cca5b-d749-4331-a5e0-8acb6d47cba2" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 837.218505] env[68906]: DEBUG oslo_concurrency.lockutils [None req-8e5cf191-d2ff-420a-92ea-3d8a1505aeef tempest-ServersTestManualDisk-672681136 tempest-ServersTestManualDisk-672681136-project-member] Lock "641cca5b-d749-4331-a5e0-8acb6d47cba2" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 839.312835] env[68906]: DEBUG oslo_concurrency.lockutils [None req-cb0f3ac9-ddb7-4977-ac45-2a85841a42ce tempest-ServerActionsTestJSON-1120749817 tempest-ServerActionsTestJSON-1120749817-project-member] Acquiring lock "0540a4dc-1b86-4776-b633-f540af168a2b" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 841.308713] env[68906]: DEBUG oslo_concurrency.lockutils [None req-6333a4bb-d4a4-44d7-b760-cbd7cf2f16fd tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Acquiring lock "4edb8b9f-b608-4be8-bfd3-65642710f9bd" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 842.150470] env[68906]: DEBUG oslo_concurrency.lockutils [None req-018fe1e0-c886-4e1d-931e-35ad80c838d5 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Acquiring lock "d6ca51b9-b284-405c-878e-fdbc326b73e1" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=68906) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 844.957104] env[68906]: DEBUG oslo_concurrency.lockutils [None req-8567c8ff-035a-4e6a-9e89-f6bf5c0abf8d tempest-VolumesAdminNegativeTest-1259132687 tempest-VolumesAdminNegativeTest-1259132687-project-member] Acquiring lock "ce63789a-1f0f-40ca-8368-ac3f84bb58cd" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 845.825603] env[68906]: DEBUG oslo_concurrency.lockutils [None req-3688da09-dcdf-4b50-97dc-63e8cf24e699 tempest-VolumesAssistedSnapshotsTest-429909811 tempest-VolumesAssistedSnapshotsTest-429909811-project-member] Acquiring lock "9a2d2803-34b1-40f7-9349-e5734a217e18" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 846.359198] env[68906]: DEBUG oslo_concurrency.lockutils [None req-0c4881c2-db84-41d7-ad4c-edd06f5d686e tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] Acquiring lock "13eebe4e-5984-46c3-bb73-cd783ad45df6" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 849.246384] env[68906]: DEBUG oslo_concurrency.lockutils [None req-da55a776-c1b9-49fd-b283-78bee6c2271a tempest-TenantUsagesTestJSON-2001711929 tempest-TenantUsagesTestJSON-2001711929-project-member] Acquiring lock "a7e0a28f-42a5-442e-b962-07771d2e6a27" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 853.051180] env[68906]: DEBUG oslo_concurrency.lockutils [None req-8f8aa184-d11f-40ed-b40e-e09ca2f6ed86 tempest-ServerDiagnosticsV248Test-570687926 tempest-ServerDiagnosticsV248Test-570687926-project-member] Acquiring lock "eb81e9b1-b573-4d7c-9ede-f8b32a43a201" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 866.972821] env[68906]: DEBUG oslo_concurrency.lockutils [None req-d970c352-17f4-4e98-8d45-165ae6d79067 tempest-ServersTestMultiNic-1243959320 tempest-ServersTestMultiNic-1243959320-project-member] Acquiring lock "6d7b4648-a12f-4c3c-8465-b8fb37eb0d3c" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 866.973678] env[68906]: DEBUG oslo_concurrency.lockutils [None req-d970c352-17f4-4e98-8d45-165ae6d79067 tempest-ServersTestMultiNic-1243959320 tempest-ServersTestMultiNic-1243959320-project-member] Lock "6d7b4648-a12f-4c3c-8465-b8fb37eb0d3c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 872.140636] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._run_pending_deletes {{(pid=68906) run_periodic_tasks 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 872.140924] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Cleaning up deleted instances {{(pid=68906) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11198}} [ 872.157180] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] There are 0 instances to clean {{(pid=68906) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11207}} [ 872.158441] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._cleanup_incomplete_migrations {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 872.158824] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Cleaning up deleted instances with incomplete migration {{(pid=68906) _cleanup_incomplete_migrations /opt/stack/nova/nova/compute/manager.py:11236}} [ 872.172281] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 873.188361] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 874.140756] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 875.152168] env[68906]: WARNING oslo_vmware.rw_handles [None req-3d9af540-455f-4eac-be79-8ed7933ffa43 tempest-MigrationsAdminTest-890819033 tempest-MigrationsAdminTest-890819033-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 875.152168] env[68906]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 875.152168] env[68906]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 875.152168] env[68906]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 875.152168] env[68906]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 875.152168] env[68906]: ERROR oslo_vmware.rw_handles response.begin() [ 875.152168] env[68906]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 875.152168] env[68906]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 875.152168] env[68906]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 875.152168] env[68906]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 875.152168] env[68906]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 875.152168] env[68906]: ERROR oslo_vmware.rw_handles [ 875.153852] env[68906]: DEBUG nova.virt.vmwareapi.images 
[None req-3d9af540-455f-4eac-be79-8ed7933ffa43 tempest-MigrationsAdminTest-890819033 tempest-MigrationsAdminTest-890819033-project-member] [instance: da0c4340-a657-43bd-9a98-4c8f50add720] Downloaded image file data b1400c31-d33b-4e13-944f-4c645e62493e to vmware_temp/c9595509-74ef-458d-a32f-7559174cf261/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk on the data store datastore2 {{(pid=68906) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 875.154418] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-3d9af540-455f-4eac-be79-8ed7933ffa43 tempest-MigrationsAdminTest-890819033 tempest-MigrationsAdminTest-890819033-project-member] [instance: da0c4340-a657-43bd-9a98-4c8f50add720] Caching image {{(pid=68906) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 875.154726] env[68906]: DEBUG nova.virt.vmwareapi.vm_util [None req-3d9af540-455f-4eac-be79-8ed7933ffa43 tempest-MigrationsAdminTest-890819033 tempest-MigrationsAdminTest-890819033-project-member] Copying Virtual Disk [datastore2] vmware_temp/c9595509-74ef-458d-a32f-7559174cf261/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk to [datastore2] vmware_temp/c9595509-74ef-458d-a32f-7559174cf261/b1400c31-d33b-4e13-944f-4c645e62493e/b1400c31-d33b-4e13-944f-4c645e62493e.vmdk {{(pid=68906) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 875.157381] env[68906]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-060ec8df-f1e3-434e-9bcb-d6375f8daa68 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 875.168323] env[68906]: DEBUG oslo_vmware.api [None req-3d9af540-455f-4eac-be79-8ed7933ffa43 tempest-MigrationsAdminTest-890819033 tempest-MigrationsAdminTest-890819033-project-member] Waiting for the task: (returnval){ [ 875.168323] env[68906]: value = "task-3475315" [ 875.168323] env[68906]: _type = "Task" [ 875.168323] env[68906]: } to complete. {{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 875.177917] env[68906]: DEBUG oslo_vmware.api [None req-3d9af540-455f-4eac-be79-8ed7933ffa43 tempest-MigrationsAdminTest-890819033 tempest-MigrationsAdminTest-890819033-project-member] Task: {'id': task-3475315, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 875.684621] env[68906]: DEBUG oslo_vmware.exceptions [None req-3d9af540-455f-4eac-be79-8ed7933ffa43 tempest-MigrationsAdminTest-890819033 tempest-MigrationsAdminTest-890819033-project-member] Fault InvalidArgument not matched. 
{{(pid=68906) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 875.685091] env[68906]: DEBUG oslo_concurrency.lockutils [None req-3d9af540-455f-4eac-be79-8ed7933ffa43 tempest-MigrationsAdminTest-890819033 tempest-MigrationsAdminTest-890819033-project-member] Releasing lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e/b1400c31-d33b-4e13-944f-4c645e62493e.vmdk" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 875.686239] env[68906]: ERROR nova.compute.manager [None req-3d9af540-455f-4eac-be79-8ed7933ffa43 tempest-MigrationsAdminTest-890819033 tempest-MigrationsAdminTest-890819033-project-member] [instance: da0c4340-a657-43bd-9a98-4c8f50add720] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 875.686239] env[68906]: Faults: ['InvalidArgument'] [ 875.686239] env[68906]: ERROR nova.compute.manager [instance: da0c4340-a657-43bd-9a98-4c8f50add720] Traceback (most recent call last): [ 875.686239] env[68906]: ERROR nova.compute.manager [instance: da0c4340-a657-43bd-9a98-4c8f50add720] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 875.686239] env[68906]: ERROR nova.compute.manager [instance: da0c4340-a657-43bd-9a98-4c8f50add720] yield resources [ 875.686239] env[68906]: ERROR nova.compute.manager [instance: da0c4340-a657-43bd-9a98-4c8f50add720] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 875.686239] env[68906]: ERROR nova.compute.manager [instance: da0c4340-a657-43bd-9a98-4c8f50add720] self.driver.spawn(context, instance, image_meta, [ 875.686239] env[68906]: ERROR nova.compute.manager [instance: da0c4340-a657-43bd-9a98-4c8f50add720] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 875.686239] env[68906]: ERROR nova.compute.manager [instance: da0c4340-a657-43bd-9a98-4c8f50add720] self._vmops.spawn(context, instance, image_meta, injected_files, [ 875.686239] env[68906]: ERROR nova.compute.manager [instance: da0c4340-a657-43bd-9a98-4c8f50add720] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 875.686239] env[68906]: ERROR nova.compute.manager [instance: da0c4340-a657-43bd-9a98-4c8f50add720] self._fetch_image_if_missing(context, vi) [ 875.686239] env[68906]: ERROR nova.compute.manager [instance: da0c4340-a657-43bd-9a98-4c8f50add720] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 875.686703] env[68906]: ERROR nova.compute.manager [instance: da0c4340-a657-43bd-9a98-4c8f50add720] image_cache(vi, tmp_image_ds_loc) [ 875.686703] env[68906]: ERROR nova.compute.manager [instance: da0c4340-a657-43bd-9a98-4c8f50add720] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 875.686703] env[68906]: ERROR nova.compute.manager [instance: da0c4340-a657-43bd-9a98-4c8f50add720] vm_util.copy_virtual_disk( [ 875.686703] env[68906]: ERROR nova.compute.manager [instance: da0c4340-a657-43bd-9a98-4c8f50add720] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 875.686703] env[68906]: ERROR nova.compute.manager [instance: da0c4340-a657-43bd-9a98-4c8f50add720] session._wait_for_task(vmdk_copy_task) [ 875.686703] env[68906]: ERROR nova.compute.manager [instance: da0c4340-a657-43bd-9a98-4c8f50add720] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in 
_wait_for_task [ 875.686703] env[68906]: ERROR nova.compute.manager [instance: da0c4340-a657-43bd-9a98-4c8f50add720] return self.wait_for_task(task_ref) [ 875.686703] env[68906]: ERROR nova.compute.manager [instance: da0c4340-a657-43bd-9a98-4c8f50add720] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 875.686703] env[68906]: ERROR nova.compute.manager [instance: da0c4340-a657-43bd-9a98-4c8f50add720] return evt.wait() [ 875.686703] env[68906]: ERROR nova.compute.manager [instance: da0c4340-a657-43bd-9a98-4c8f50add720] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 875.686703] env[68906]: ERROR nova.compute.manager [instance: da0c4340-a657-43bd-9a98-4c8f50add720] result = hub.switch() [ 875.686703] env[68906]: ERROR nova.compute.manager [instance: da0c4340-a657-43bd-9a98-4c8f50add720] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 875.686703] env[68906]: ERROR nova.compute.manager [instance: da0c4340-a657-43bd-9a98-4c8f50add720] return self.greenlet.switch() [ 875.687074] env[68906]: ERROR nova.compute.manager [instance: da0c4340-a657-43bd-9a98-4c8f50add720] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 875.687074] env[68906]: ERROR nova.compute.manager [instance: da0c4340-a657-43bd-9a98-4c8f50add720] self.f(*self.args, **self.kw) [ 875.687074] env[68906]: ERROR nova.compute.manager [instance: da0c4340-a657-43bd-9a98-4c8f50add720] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 875.687074] env[68906]: ERROR nova.compute.manager [instance: da0c4340-a657-43bd-9a98-4c8f50add720] raise exceptions.translate_fault(task_info.error) [ 875.687074] env[68906]: ERROR nova.compute.manager [instance: da0c4340-a657-43bd-9a98-4c8f50add720] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 875.687074] env[68906]: ERROR nova.compute.manager [instance: da0c4340-a657-43bd-9a98-4c8f50add720] Faults: ['InvalidArgument'] [ 875.687074] env[68906]: ERROR nova.compute.manager [instance: da0c4340-a657-43bd-9a98-4c8f50add720] [ 875.687074] env[68906]: INFO nova.compute.manager [None req-3d9af540-455f-4eac-be79-8ed7933ffa43 tempest-MigrationsAdminTest-890819033 tempest-MigrationsAdminTest-890819033-project-member] [instance: da0c4340-a657-43bd-9a98-4c8f50add720] Terminating instance [ 875.687624] env[68906]: DEBUG oslo_concurrency.lockutils [None req-96841733-5efb-421f-b645-bab4686e4f9f tempest-ServerActionsTestJSON-1120749817 tempest-ServerActionsTestJSON-1120749817-project-member] Acquired lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e/b1400c31-d33b-4e13-944f-4c645e62493e.vmdk" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 875.688044] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-96841733-5efb-421f-b645-bab4686e4f9f tempest-ServerActionsTestJSON-1120749817 tempest-ServerActionsTestJSON-1120749817-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=68906) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 875.688779] env[68906]: DEBUG nova.compute.manager [None req-3d9af540-455f-4eac-be79-8ed7933ffa43 tempest-MigrationsAdminTest-890819033 tempest-MigrationsAdminTest-890819033-project-member] [instance: da0c4340-a657-43bd-9a98-4c8f50add720] Start 
destroying the instance on the hypervisor. {{(pid=68906) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 875.688993] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-3d9af540-455f-4eac-be79-8ed7933ffa43 tempest-MigrationsAdminTest-890819033 tempest-MigrationsAdminTest-890819033-project-member] [instance: da0c4340-a657-43bd-9a98-4c8f50add720] Destroying instance {{(pid=68906) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 875.689246] env[68906]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-d84309c3-9738-49eb-abda-26b1b43ea8dc {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 875.691766] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ba813f8e-ef58-4519-b37e-02b9d16b9e6c {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 875.700318] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-3d9af540-455f-4eac-be79-8ed7933ffa43 tempest-MigrationsAdminTest-890819033 tempest-MigrationsAdminTest-890819033-project-member] [instance: da0c4340-a657-43bd-9a98-4c8f50add720] Unregistering the VM {{(pid=68906) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 875.700916] env[68906]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-f62a49ab-efe8-418d-acef-d515ea851861 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 875.702672] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-96841733-5efb-421f-b645-bab4686e4f9f tempest-ServerActionsTestJSON-1120749817 tempest-ServerActionsTestJSON-1120749817-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=68906) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 875.702672] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-96841733-5efb-421f-b645-bab4686e4f9f tempest-ServerActionsTestJSON-1120749817 tempest-ServerActionsTestJSON-1120749817-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=68906) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 875.703399] env[68906]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-b28106b7-1321-44ff-b503-b0133befb9a0 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 875.708869] env[68906]: DEBUG oslo_vmware.api [None req-96841733-5efb-421f-b645-bab4686e4f9f tempest-ServerActionsTestJSON-1120749817 tempest-ServerActionsTestJSON-1120749817-project-member] Waiting for the task: (returnval){ [ 875.708869] env[68906]: value = "session[52a3cb0d-1212-490c-2669-91043b4da4d8]5283024b-a106-a326-b840-f5daefd71923" [ 875.708869] env[68906]: _type = "Task" [ 875.708869] env[68906]: } to complete. {{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 875.717424] env[68906]: DEBUG oslo_vmware.api [None req-96841733-5efb-421f-b645-bab4686e4f9f tempest-ServerActionsTestJSON-1120749817 tempest-ServerActionsTestJSON-1120749817-project-member] Task: {'id': session[52a3cb0d-1212-490c-2669-91043b4da4d8]5283024b-a106-a326-b840-f5daefd71923, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 875.781025] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-3d9af540-455f-4eac-be79-8ed7933ffa43 tempest-MigrationsAdminTest-890819033 tempest-MigrationsAdminTest-890819033-project-member] [instance: da0c4340-a657-43bd-9a98-4c8f50add720] Unregistered the VM {{(pid=68906) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 875.781025] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-3d9af540-455f-4eac-be79-8ed7933ffa43 tempest-MigrationsAdminTest-890819033 tempest-MigrationsAdminTest-890819033-project-member] [instance: da0c4340-a657-43bd-9a98-4c8f50add720] Deleting contents of the VM from datastore datastore2 {{(pid=68906) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 875.781025] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-3d9af540-455f-4eac-be79-8ed7933ffa43 tempest-MigrationsAdminTest-890819033 tempest-MigrationsAdminTest-890819033-project-member] Deleting the datastore file [datastore2] da0c4340-a657-43bd-9a98-4c8f50add720 {{(pid=68906) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 875.781025] env[68906]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-05e0749b-3f51-4d52-99f9-46104c69a0df {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 875.790133] env[68906]: DEBUG oslo_vmware.api [None req-3d9af540-455f-4eac-be79-8ed7933ffa43 tempest-MigrationsAdminTest-890819033 tempest-MigrationsAdminTest-890819033-project-member] Waiting for the task: (returnval){ [ 875.790133] env[68906]: value = "task-3475317" [ 875.790133] env[68906]: _type = "Task" [ 875.790133] env[68906]: } to complete. {{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 875.799311] env[68906]: DEBUG oslo_vmware.api [None req-3d9af540-455f-4eac-be79-8ed7933ffa43 tempest-MigrationsAdminTest-890819033 tempest-MigrationsAdminTest-890819033-project-member] Task: {'id': task-3475317, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 876.135851] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 876.140640] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 876.224201] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-96841733-5efb-421f-b645-bab4686e4f9f tempest-ServerActionsTestJSON-1120749817 tempest-ServerActionsTestJSON-1120749817-project-member] [instance: 0540a4dc-1b86-4776-b633-f540af168a2b] Preparing fetch location {{(pid=68906) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 876.224449] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-96841733-5efb-421f-b645-bab4686e4f9f tempest-ServerActionsTestJSON-1120749817 tempest-ServerActionsTestJSON-1120749817-project-member] Creating directory with path [datastore2] vmware_temp/1996884e-9fc7-431a-916e-3800ba26d166/b1400c31-d33b-4e13-944f-4c645e62493e {{(pid=68906) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 876.224711] env[68906]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-f448746f-e2cd-4f68-bb0f-dee47f00baa3 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 876.237222] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-96841733-5efb-421f-b645-bab4686e4f9f tempest-ServerActionsTestJSON-1120749817 tempest-ServerActionsTestJSON-1120749817-project-member] Created directory with path [datastore2] vmware_temp/1996884e-9fc7-431a-916e-3800ba26d166/b1400c31-d33b-4e13-944f-4c645e62493e {{(pid=68906) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 876.238489] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-96841733-5efb-421f-b645-bab4686e4f9f tempest-ServerActionsTestJSON-1120749817 tempest-ServerActionsTestJSON-1120749817-project-member] [instance: 0540a4dc-1b86-4776-b633-f540af168a2b] Fetch image to [datastore2] vmware_temp/1996884e-9fc7-431a-916e-3800ba26d166/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk {{(pid=68906) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 876.238708] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-96841733-5efb-421f-b645-bab4686e4f9f tempest-ServerActionsTestJSON-1120749817 tempest-ServerActionsTestJSON-1120749817-project-member] [instance: 0540a4dc-1b86-4776-b633-f540af168a2b] Downloading image file data b1400c31-d33b-4e13-944f-4c645e62493e to [datastore2] vmware_temp/1996884e-9fc7-431a-916e-3800ba26d166/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk on the data store datastore2 {{(pid=68906) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 876.239596] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d6a34729-e5db-4238-93f3-6720c48253bd {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 876.248808] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-b153c8a2-508c-44c1-932c-c493da269f88 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 876.265535] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3c265cbb-bf56-4944-8553-ee5e5063bc93 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 876.318053] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-94a16636-fc7d-4005-ba17-c107fc7fc9fe {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 876.326504] env[68906]: DEBUG oslo_vmware.api [None req-3d9af540-455f-4eac-be79-8ed7933ffa43 tempest-MigrationsAdminTest-890819033 tempest-MigrationsAdminTest-890819033-project-member] Task: {'id': task-3475317, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.067626} completed successfully. {{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 876.329034] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-3d9af540-455f-4eac-be79-8ed7933ffa43 tempest-MigrationsAdminTest-890819033 tempest-MigrationsAdminTest-890819033-project-member] Deleted the datastore file {{(pid=68906) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 876.329034] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-3d9af540-455f-4eac-be79-8ed7933ffa43 tempest-MigrationsAdminTest-890819033 tempest-MigrationsAdminTest-890819033-project-member] [instance: da0c4340-a657-43bd-9a98-4c8f50add720] Deleted contents of the VM from datastore datastore2 {{(pid=68906) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 876.329034] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-3d9af540-455f-4eac-be79-8ed7933ffa43 tempest-MigrationsAdminTest-890819033 tempest-MigrationsAdminTest-890819033-project-member] [instance: da0c4340-a657-43bd-9a98-4c8f50add720] Instance destroyed {{(pid=68906) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 876.329034] env[68906]: INFO nova.compute.manager [None req-3d9af540-455f-4eac-be79-8ed7933ffa43 tempest-MigrationsAdminTest-890819033 tempest-MigrationsAdminTest-890819033-project-member] [instance: da0c4340-a657-43bd-9a98-4c8f50add720] Took 0.64 seconds to destroy the instance on the hypervisor. 
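The task-3475317 sequence above shows oslo.vmware's task pattern end to end: a vSphere *_Task method (here DeleteDatastoreFile_Task, earlier CopyVirtualDisk_Task) is invoked through the session, returns a task reference immediately, and wait_for_task polls it (the "progress is 0%" entries from _poll_task) until it reaches a terminal state or raises a translated fault. A minimal sketch of that pattern, assuming a reachable vCenter; the host, credentials, and datastore path below are hypothetical placeholders, not values from this log:

from oslo_vmware import api
from oslo_vmware import exceptions as vexc

# Hypothetical endpoint and credentials; constructing the session
# logs in to vCenter immediately (create_session defaults to True).
session = api.VMwareAPISession(
    'vc.example.org', 'user', 'secret',
    api_retry_count=3,       # retries on transient API errors
    task_poll_interval=0.5)  # seconds between _poll_task calls

# *_Task methods return a task reference, not a result.
task = session.invoke_api(
    session.vim, 'DeleteDatastoreFile_Task',
    session.vim.service_content.fileManager,
    name='[datastore2] vmware_temp/some-dir',  # hypothetical path
    datacenter=None)  # a real call passes a Datacenter managed-object ref

try:
    # Blocks while the task is polled, as in the log entries above;
    # returns the TaskInfo once its state is 'success'.
    task_info = session.wait_for_task(task)
    print('task finished with state: %s' % task_info.state)
except vexc.VimFaultException as e:
    # On failure _poll_task raises translate_fault(task_info.error),
    # carrying the fault names, e.g. ['InvalidArgument'] above.
    print('task failed with faults: %s' % e.fault_list)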
[ 876.330849] env[68906]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-f4dd47cf-cb67-4d15-a5a3-9e7ac202d095 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 876.332997] env[68906]: DEBUG nova.compute.claims [None req-3d9af540-455f-4eac-be79-8ed7933ffa43 tempest-MigrationsAdminTest-890819033 tempest-MigrationsAdminTest-890819033-project-member] [instance: da0c4340-a657-43bd-9a98-4c8f50add720] Aborting claim: {{(pid=68906) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 876.333614] env[68906]: DEBUG oslo_concurrency.lockutils [None req-3d9af540-455f-4eac-be79-8ed7933ffa43 tempest-MigrationsAdminTest-890819033 tempest-MigrationsAdminTest-890819033-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 876.333614] env[68906]: DEBUG oslo_concurrency.lockutils [None req-3d9af540-455f-4eac-be79-8ed7933ffa43 tempest-MigrationsAdminTest-890819033 tempest-MigrationsAdminTest-890819033-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 876.365601] env[68906]: DEBUG nova.virt.vmwareapi.images [None req-96841733-5efb-421f-b645-bab4686e4f9f tempest-ServerActionsTestJSON-1120749817 tempest-ServerActionsTestJSON-1120749817-project-member] [instance: 0540a4dc-1b86-4776-b633-f540af168a2b] Downloading image file data b1400c31-d33b-4e13-944f-4c645e62493e to the data store datastore2 {{(pid=68906) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 876.456212] env[68906]: DEBUG oslo_vmware.rw_handles [None req-96841733-5efb-421f-b645-bab4686e4f9f tempest-ServerActionsTestJSON-1120749817 tempest-ServerActionsTestJSON-1120749817-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/1996884e-9fc7-431a-916e-3800ba26d166/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=68906) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 876.535943] env[68906]: DEBUG oslo_vmware.rw_handles [None req-96841733-5efb-421f-b645-bab4686e4f9f tempest-ServerActionsTestJSON-1120749817 tempest-ServerActionsTestJSON-1120749817-project-member] Completed reading data from the image iterator. {{(pid=68906) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 876.536221] env[68906]: DEBUG oslo_vmware.rw_handles [None req-96841733-5efb-421f-b645-bab4686e4f9f tempest-ServerActionsTestJSON-1120749817 tempest-ServerActionsTestJSON-1120749817-project-member] Closing write handle for https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/1996884e-9fc7-431a-916e-3800ba26d166/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=68906) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 876.575480] env[68906]: DEBUG nova.scheduler.client.report [None req-3d9af540-455f-4eac-be79-8ed7933ffa43 tempest-MigrationsAdminTest-890819033 tempest-MigrationsAdminTest-890819033-project-member] Refreshing inventories for resource provider 1119f6db-bfd7-4ef3-bdff-5c6974dc249b {{(pid=68906) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:804}} [ 876.576735] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fe2cd7df-ed0e-48a4-b92a-bb5f48ec790c tempest-InstanceActionsV221TestJSON-740479432 tempest-InstanceActionsV221TestJSON-740479432-project-member] Acquiring lock "ad955cdc-85f1-4096-b2ec-7635d289ee57" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 876.576924] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fe2cd7df-ed0e-48a4-b92a-bb5f48ec790c tempest-InstanceActionsV221TestJSON-740479432 tempest-InstanceActionsV221TestJSON-740479432-project-member] Lock "ad955cdc-85f1-4096-b2ec-7635d289ee57" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 876.593289] env[68906]: DEBUG nova.scheduler.client.report [None req-3d9af540-455f-4eac-be79-8ed7933ffa43 tempest-MigrationsAdminTest-890819033 tempest-MigrationsAdminTest-890819033-project-member] Updating ProviderTree inventory for provider 1119f6db-bfd7-4ef3-bdff-5c6974dc249b from _refresh_and_get_inventory using data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 93, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68906) _refresh_and_get_inventory /opt/stack/nova/nova/scheduler/client/report.py:768}} [ 876.593581] env[68906]: DEBUG nova.compute.provider_tree [None req-3d9af540-455f-4eac-be79-8ed7933ffa43 tempest-MigrationsAdminTest-890819033 tempest-MigrationsAdminTest-890819033-project-member] Updating inventory in ProviderTree for provider 1119f6db-bfd7-4ef3-bdff-5c6974dc249b with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 93, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68906) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 876.609923] env[68906]: DEBUG nova.scheduler.client.report [None req-3d9af540-455f-4eac-be79-8ed7933ffa43 tempest-MigrationsAdminTest-890819033 tempest-MigrationsAdminTest-890819033-project-member] Refreshing aggregate associations for resource provider 1119f6db-bfd7-4ef3-bdff-5c6974dc249b, aggregates: None {{(pid=68906) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:813}} [ 876.634829] env[68906]: DEBUG nova.scheduler.client.report [None req-3d9af540-455f-4eac-be79-8ed7933ffa43 tempest-MigrationsAdminTest-890819033 tempest-MigrationsAdminTest-890819033-project-member] Refreshing trait 
associations for resource provider 1119f6db-bfd7-4ef3-bdff-5c6974dc249b, traits: COMPUTE_SAME_HOST_COLD_MIGRATE,COMPUTE_IMAGE_TYPE_VMDK,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NODE,COMPUTE_IMAGE_TYPE_ISO {{(pid=68906) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:825}} [ 876.998084] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2c0a30a1-979b-43da-b9ab-1b2bf730d904 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 877.005943] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6f29c25f-6680-4c06-93ed-81c3899bfa8b {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 877.039932] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1284dcda-9f17-49f6-a1a3-e008ec6262d8 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 877.048643] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9a085718-0f89-44e9-81a8-14649d82e155 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 877.062784] env[68906]: DEBUG nova.compute.provider_tree [None req-3d9af540-455f-4eac-be79-8ed7933ffa43 tempest-MigrationsAdminTest-890819033 tempest-MigrationsAdminTest-890819033-project-member] Inventory has not changed in ProviderTree for provider: 1119f6db-bfd7-4ef3-bdff-5c6974dc249b {{(pid=68906) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 877.075871] env[68906]: DEBUG nova.scheduler.client.report [None req-3d9af540-455f-4eac-be79-8ed7933ffa43 tempest-MigrationsAdminTest-890819033 tempest-MigrationsAdminTest-890819033-project-member] Inventory has not changed for provider 1119f6db-bfd7-4ef3-bdff-5c6974dc249b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 93, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68906) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 877.094740] env[68906]: DEBUG oslo_concurrency.lockutils [None req-3d9af540-455f-4eac-be79-8ed7933ffa43 tempest-MigrationsAdminTest-890819033 tempest-MigrationsAdminTest-890819033-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.761s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 877.095280] env[68906]: ERROR nova.compute.manager [None req-3d9af540-455f-4eac-be79-8ed7933ffa43 tempest-MigrationsAdminTest-890819033 tempest-MigrationsAdminTest-890819033-project-member] [instance: da0c4340-a657-43bd-9a98-4c8f50add720] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 877.095280] env[68906]: Faults: ['InvalidArgument'] [ 877.095280] env[68906]: ERROR nova.compute.manager [instance: da0c4340-a657-43bd-9a98-4c8f50add720] Traceback (most recent call last): [ 877.095280] env[68906]: ERROR nova.compute.manager [instance: 
da0c4340-a657-43bd-9a98-4c8f50add720] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 877.095280] env[68906]: ERROR nova.compute.manager [instance: da0c4340-a657-43bd-9a98-4c8f50add720] self.driver.spawn(context, instance, image_meta, [ 877.095280] env[68906]: ERROR nova.compute.manager [instance: da0c4340-a657-43bd-9a98-4c8f50add720] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 877.095280] env[68906]: ERROR nova.compute.manager [instance: da0c4340-a657-43bd-9a98-4c8f50add720] self._vmops.spawn(context, instance, image_meta, injected_files, [ 877.095280] env[68906]: ERROR nova.compute.manager [instance: da0c4340-a657-43bd-9a98-4c8f50add720] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 877.095280] env[68906]: ERROR nova.compute.manager [instance: da0c4340-a657-43bd-9a98-4c8f50add720] self._fetch_image_if_missing(context, vi) [ 877.095280] env[68906]: ERROR nova.compute.manager [instance: da0c4340-a657-43bd-9a98-4c8f50add720] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 877.095280] env[68906]: ERROR nova.compute.manager [instance: da0c4340-a657-43bd-9a98-4c8f50add720] image_cache(vi, tmp_image_ds_loc) [ 877.095280] env[68906]: ERROR nova.compute.manager [instance: da0c4340-a657-43bd-9a98-4c8f50add720] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 877.095651] env[68906]: ERROR nova.compute.manager [instance: da0c4340-a657-43bd-9a98-4c8f50add720] vm_util.copy_virtual_disk( [ 877.095651] env[68906]: ERROR nova.compute.manager [instance: da0c4340-a657-43bd-9a98-4c8f50add720] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 877.095651] env[68906]: ERROR nova.compute.manager [instance: da0c4340-a657-43bd-9a98-4c8f50add720] session._wait_for_task(vmdk_copy_task) [ 877.095651] env[68906]: ERROR nova.compute.manager [instance: da0c4340-a657-43bd-9a98-4c8f50add720] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 877.095651] env[68906]: ERROR nova.compute.manager [instance: da0c4340-a657-43bd-9a98-4c8f50add720] return self.wait_for_task(task_ref) [ 877.095651] env[68906]: ERROR nova.compute.manager [instance: da0c4340-a657-43bd-9a98-4c8f50add720] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 877.095651] env[68906]: ERROR nova.compute.manager [instance: da0c4340-a657-43bd-9a98-4c8f50add720] return evt.wait() [ 877.095651] env[68906]: ERROR nova.compute.manager [instance: da0c4340-a657-43bd-9a98-4c8f50add720] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 877.095651] env[68906]: ERROR nova.compute.manager [instance: da0c4340-a657-43bd-9a98-4c8f50add720] result = hub.switch() [ 877.095651] env[68906]: ERROR nova.compute.manager [instance: da0c4340-a657-43bd-9a98-4c8f50add720] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 877.095651] env[68906]: ERROR nova.compute.manager [instance: da0c4340-a657-43bd-9a98-4c8f50add720] return self.greenlet.switch() [ 877.095651] env[68906]: ERROR nova.compute.manager [instance: da0c4340-a657-43bd-9a98-4c8f50add720] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 877.095651] env[68906]: ERROR nova.compute.manager [instance: da0c4340-a657-43bd-9a98-4c8f50add720] self.f(*self.args, 
**self.kw) [ 877.095975] env[68906]: ERROR nova.compute.manager [instance: da0c4340-a657-43bd-9a98-4c8f50add720] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 877.095975] env[68906]: ERROR nova.compute.manager [instance: da0c4340-a657-43bd-9a98-4c8f50add720] raise exceptions.translate_fault(task_info.error) [ 877.095975] env[68906]: ERROR nova.compute.manager [instance: da0c4340-a657-43bd-9a98-4c8f50add720] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 877.095975] env[68906]: ERROR nova.compute.manager [instance: da0c4340-a657-43bd-9a98-4c8f50add720] Faults: ['InvalidArgument'] [ 877.095975] env[68906]: ERROR nova.compute.manager [instance: da0c4340-a657-43bd-9a98-4c8f50add720] [ 877.096126] env[68906]: DEBUG nova.compute.utils [None req-3d9af540-455f-4eac-be79-8ed7933ffa43 tempest-MigrationsAdminTest-890819033 tempest-MigrationsAdminTest-890819033-project-member] [instance: da0c4340-a657-43bd-9a98-4c8f50add720] VimFaultException {{(pid=68906) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 877.098459] env[68906]: DEBUG nova.compute.manager [None req-3d9af540-455f-4eac-be79-8ed7933ffa43 tempest-MigrationsAdminTest-890819033 tempest-MigrationsAdminTest-890819033-project-member] [instance: da0c4340-a657-43bd-9a98-4c8f50add720] Build of instance da0c4340-a657-43bd-9a98-4c8f50add720 was re-scheduled: A specified parameter was not correct: fileType [ 877.098459] env[68906]: Faults: ['InvalidArgument'] {{(pid=68906) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 877.098828] env[68906]: DEBUG nova.compute.manager [None req-3d9af540-455f-4eac-be79-8ed7933ffa43 tempest-MigrationsAdminTest-890819033 tempest-MigrationsAdminTest-890819033-project-member] [instance: da0c4340-a657-43bd-9a98-4c8f50add720] Unplugging VIFs for instance {{(pid=68906) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 877.099018] env[68906]: DEBUG nova.compute.manager [None req-3d9af540-455f-4eac-be79-8ed7933ffa43 tempest-MigrationsAdminTest-890819033 tempest-MigrationsAdminTest-890819033-project-member] Virt driver does not provide unplug_vifs method, so it is not possible to determine if VIFs should be unplugged.
{{(pid=68906) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 877.099192] env[68906]: DEBUG nova.compute.manager [None req-3d9af540-455f-4eac-be79-8ed7933ffa43 tempest-MigrationsAdminTest-890819033 tempest-MigrationsAdminTest-890819033-project-member] [instance: da0c4340-a657-43bd-9a98-4c8f50add720] Deallocating network for instance {{(pid=68906) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 877.099358] env[68906]: DEBUG nova.network.neutron [None req-3d9af540-455f-4eac-be79-8ed7933ffa43 tempest-MigrationsAdminTest-890819033 tempest-MigrationsAdminTest-890819033-project-member] [instance: da0c4340-a657-43bd-9a98-4c8f50add720] deallocate_for_instance() {{(pid=68906) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 877.141952] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 877.141952] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Starting heal instance info cache {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 877.141952] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Rebuilding the list of instances to heal {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 877.172587] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: 0540a4dc-1b86-4776-b633-f540af168a2b] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 877.172746] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: 4edb8b9f-b608-4be8-bfd3-65642710f9bd] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 877.172912] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: d6ca51b9-b284-405c-878e-fdbc326b73e1] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 877.173023] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: f42056e5-52cb-4d69-8022-ca643c49194e] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 877.173696] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: ce63789a-1f0f-40ca-8368-ac3f84bb58cd] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 877.173696] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: 9a2d2803-34b1-40f7-9349-e5734a217e18] Skipping network cache update for instance because it is Building. 
{{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 877.173696] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: 13eebe4e-5984-46c3-bb73-cd783ad45df6] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 877.173696] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 877.173867] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 877.173980] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Didn't find any instances for network info cache update. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 877.725489] env[68906]: DEBUG nova.network.neutron [None req-3d9af540-455f-4eac-be79-8ed7933ffa43 tempest-MigrationsAdminTest-890819033 tempest-MigrationsAdminTest-890819033-project-member] [instance: da0c4340-a657-43bd-9a98-4c8f50add720] Updating instance_info_cache with network_info: [] {{(pid=68906) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 877.739771] env[68906]: INFO nova.compute.manager [None req-3d9af540-455f-4eac-be79-8ed7933ffa43 tempest-MigrationsAdminTest-890819033 tempest-MigrationsAdminTest-890819033-project-member] [instance: da0c4340-a657-43bd-9a98-4c8f50add720] Took 0.64 seconds to deallocate network for instance. 
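The earlier "Fault InvalidArgument not matched" debug entry explains why this failure surfaces the way it does: oslo.vmware's get_fault_class looks the fault name up in its registry of specific exception classes, and when nothing matches it returns None, so translate_fault falls back to the generic VimFaultException that nova catches in _build_and_run_instance and turns into the re-schedule above. A self-contained sketch of that mapping; the fault names are taken from this log and no vCenter is required:

from oslo_vmware import exceptions as vexc

# Registered fault names map to specific exception classes ...
print(vexc.get_fault_class('FileNotFound'))     # e.g. a FileNotFound exception class
# ... while unregistered names log "Fault ... not matched." and return None,
# leaving translate_fault to fall back to the generic class.
print(vexc.get_fault_class('InvalidArgument'))  # None

# What nova's except clause ends up with for the failure above:
exc = vexc.VimFaultException(
    ['InvalidArgument'],
    'A specified parameter was not correct: fileType')
if 'InvalidArgument' in exc.fault_list:
    print(exc)  # renders the message plus Faults: ['InvalidArgument'],
                # exactly as in the traceback above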
[ 877.867972] env[68906]: INFO nova.scheduler.client.report [None req-3d9af540-455f-4eac-be79-8ed7933ffa43 tempest-MigrationsAdminTest-890819033 tempest-MigrationsAdminTest-890819033-project-member] Deleted allocations for instance da0c4340-a657-43bd-9a98-4c8f50add720 [ 877.902104] env[68906]: DEBUG oslo_concurrency.lockutils [None req-3d9af540-455f-4eac-be79-8ed7933ffa43 tempest-MigrationsAdminTest-890819033 tempest-MigrationsAdminTest-890819033-project-member] Lock "da0c4340-a657-43bd-9a98-4c8f50add720" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 244.772s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 877.903305] env[68906]: DEBUG oslo_concurrency.lockutils [None req-74c68a04-99ad-4af6-a0d0-f24a0942ce19 tempest-MigrationsAdminTest-890819033 tempest-MigrationsAdminTest-890819033-project-member] Lock "da0c4340-a657-43bd-9a98-4c8f50add720" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 46.950s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 877.903507] env[68906]: DEBUG oslo_concurrency.lockutils [None req-74c68a04-99ad-4af6-a0d0-f24a0942ce19 tempest-MigrationsAdminTest-890819033 tempest-MigrationsAdminTest-890819033-project-member] Acquiring lock "da0c4340-a657-43bd-9a98-4c8f50add720-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 877.903729] env[68906]: DEBUG oslo_concurrency.lockutils [None req-74c68a04-99ad-4af6-a0d0-f24a0942ce19 tempest-MigrationsAdminTest-890819033 tempest-MigrationsAdminTest-890819033-project-member] Lock "da0c4340-a657-43bd-9a98-4c8f50add720-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 877.903937] env[68906]: DEBUG oslo_concurrency.lockutils [None req-74c68a04-99ad-4af6-a0d0-f24a0942ce19 tempest-MigrationsAdminTest-890819033 tempest-MigrationsAdminTest-890819033-project-member] Lock "da0c4340-a657-43bd-9a98-4c8f50add720-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 877.908681] env[68906]: INFO nova.compute.manager [None req-74c68a04-99ad-4af6-a0d0-f24a0942ce19 tempest-MigrationsAdminTest-890819033 tempest-MigrationsAdminTest-890819033-project-member] [instance: da0c4340-a657-43bd-9a98-4c8f50add720] Terminating instance [ 877.913100] env[68906]: DEBUG nova.compute.manager [None req-74c68a04-99ad-4af6-a0d0-f24a0942ce19 tempest-MigrationsAdminTest-890819033 tempest-MigrationsAdminTest-890819033-project-member] [instance: da0c4340-a657-43bd-9a98-4c8f50add720] Start destroying the instance on the hypervisor. 
{{(pid=68906) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 877.913212] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-74c68a04-99ad-4af6-a0d0-f24a0942ce19 tempest-MigrationsAdminTest-890819033 tempest-MigrationsAdminTest-890819033-project-member] [instance: da0c4340-a657-43bd-9a98-4c8f50add720] Destroying instance {{(pid=68906) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 877.913759] env[68906]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-3f62de1a-2650-4891-aee2-04268a850c85 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 877.920859] env[68906]: DEBUG nova.compute.manager [None req-33e1c89b-3692-4ec9-abc0-4ab4dc3be0f7 tempest-ServersWithSpecificFlavorTestJSON-1201791476 tempest-ServersWithSpecificFlavorTestJSON-1201791476-project-member] [instance: 12724be5-cfb1-4cf6-b98b-b4142da21714] Starting instance... {{(pid=68906) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 877.927390] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2b748806-8381-4f16-a6e7-d0e63e8e579a {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 877.963155] env[68906]: WARNING nova.virt.vmwareapi.vmops [None req-74c68a04-99ad-4af6-a0d0-f24a0942ce19 tempest-MigrationsAdminTest-890819033 tempest-MigrationsAdminTest-890819033-project-member] [instance: da0c4340-a657-43bd-9a98-4c8f50add720] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance da0c4340-a657-43bd-9a98-4c8f50add720 could not be found. [ 877.963438] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-74c68a04-99ad-4af6-a0d0-f24a0942ce19 tempest-MigrationsAdminTest-890819033 tempest-MigrationsAdminTest-890819033-project-member] [instance: da0c4340-a657-43bd-9a98-4c8f50add720] Instance destroyed {{(pid=68906) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 877.964091] env[68906]: INFO nova.compute.manager [None req-74c68a04-99ad-4af6-a0d0-f24a0942ce19 tempest-MigrationsAdminTest-890819033 tempest-MigrationsAdminTest-890819033-project-member] [instance: da0c4340-a657-43bd-9a98-4c8f50add720] Took 0.05 seconds to destroy the instance on the hypervisor. [ 877.964319] env[68906]: DEBUG oslo.service.loopingcall [None req-74c68a04-99ad-4af6-a0d0-f24a0942ce19 tempest-MigrationsAdminTest-890819033 tempest-MigrationsAdminTest-890819033-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=68906) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 877.964561] env[68906]: DEBUG nova.compute.manager [-] [instance: da0c4340-a657-43bd-9a98-4c8f50add720] Deallocating network for instance {{(pid=68906) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 877.964661] env[68906]: DEBUG nova.network.neutron [-] [instance: da0c4340-a657-43bd-9a98-4c8f50add720] deallocate_for_instance() {{(pid=68906) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 877.968689] env[68906]: DEBUG nova.compute.manager [None req-33e1c89b-3692-4ec9-abc0-4ab4dc3be0f7 tempest-ServersWithSpecificFlavorTestJSON-1201791476 tempest-ServersWithSpecificFlavorTestJSON-1201791476-project-member] [instance: 12724be5-cfb1-4cf6-b98b-b4142da21714] Instance disappeared before build. 
{{(pid=68906) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 878.006553] env[68906]: DEBUG oslo_concurrency.lockutils [None req-33e1c89b-3692-4ec9-abc0-4ab4dc3be0f7 tempest-ServersWithSpecificFlavorTestJSON-1201791476 tempest-ServersWithSpecificFlavorTestJSON-1201791476-project-member] Lock "12724be5-cfb1-4cf6-b98b-b4142da21714" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 218.923s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 878.019889] env[68906]: DEBUG nova.compute.manager [None req-5cbdaf50-2b7b-401c-a522-7be104ee2090 tempest-ServersTestMultiNic-1243959320 tempest-ServersTestMultiNic-1243959320-project-member] [instance: b3bd0ecb-f329-48f3-b48b-25751262a5eb] Starting instance... {{(pid=68906) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 878.054656] env[68906]: DEBUG nova.network.neutron [-] [instance: da0c4340-a657-43bd-9a98-4c8f50add720] Updating instance_info_cache with network_info: [] {{(pid=68906) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 878.054656] env[68906]: DEBUG nova.compute.manager [None req-5cbdaf50-2b7b-401c-a522-7be104ee2090 tempest-ServersTestMultiNic-1243959320 tempest-ServersTestMultiNic-1243959320-project-member] [instance: b3bd0ecb-f329-48f3-b48b-25751262a5eb] Instance disappeared before build. {{(pid=68906) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 878.067841] env[68906]: INFO nova.compute.manager [-] [instance: da0c4340-a657-43bd-9a98-4c8f50add720] Took 0.10 seconds to deallocate network for instance. [ 878.100337] env[68906]: DEBUG oslo_concurrency.lockutils [None req-5cbdaf50-2b7b-401c-a522-7be104ee2090 tempest-ServersTestMultiNic-1243959320 tempest-ServersTestMultiNic-1243959320-project-member] Lock "b3bd0ecb-f329-48f3-b48b-25751262a5eb" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 214.550s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 878.144028] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 878.144028] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 878.144028] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 878.144028] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=68906) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 878.148496] env[68906]: DEBUG nova.compute.manager [None req-76e436a4-02de-4d9d-8c6d-05a0643fcf58 tempest-ServersTestFqdnHostnames-1135812067 tempest-ServersTestFqdnHostnames-1135812067-project-member] [instance: e63fba5c-46fd-494c-9aec-dd76f12974d7] Starting instance... {{(pid=68906) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 878.203724] env[68906]: DEBUG nova.compute.manager [None req-76e436a4-02de-4d9d-8c6d-05a0643fcf58 tempest-ServersTestFqdnHostnames-1135812067 tempest-ServersTestFqdnHostnames-1135812067-project-member] [instance: e63fba5c-46fd-494c-9aec-dd76f12974d7] Instance disappeared before build. {{(pid=68906) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 878.255813] env[68906]: DEBUG oslo_concurrency.lockutils [None req-76e436a4-02de-4d9d-8c6d-05a0643fcf58 tempest-ServersTestFqdnHostnames-1135812067 tempest-ServersTestFqdnHostnames-1135812067-project-member] Lock "e63fba5c-46fd-494c-9aec-dd76f12974d7" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 213.995s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 878.272227] env[68906]: DEBUG nova.compute.manager [None req-788bd7dd-0cd5-4bb7-8cd1-04d3a7d25d53 tempest-ListServerFiltersTestJSON-1324676598 tempest-ListServerFiltersTestJSON-1324676598-project-member] [instance: c02f41e2-8a99-4f18-9d86-82fa702bb2b6] Starting instance... {{(pid=68906) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 878.315688] env[68906]: DEBUG nova.compute.manager [None req-788bd7dd-0cd5-4bb7-8cd1-04d3a7d25d53 tempest-ListServerFiltersTestJSON-1324676598 tempest-ListServerFiltersTestJSON-1324676598-project-member] [instance: c02f41e2-8a99-4f18-9d86-82fa702bb2b6] Instance disappeared before build. {{(pid=68906) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 878.328654] env[68906]: DEBUG oslo_concurrency.lockutils [None req-74c68a04-99ad-4af6-a0d0-f24a0942ce19 tempest-MigrationsAdminTest-890819033 tempest-MigrationsAdminTest-890819033-project-member] Lock "da0c4340-a657-43bd-9a98-4c8f50add720" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.425s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 878.344601] env[68906]: DEBUG oslo_concurrency.lockutils [None req-788bd7dd-0cd5-4bb7-8cd1-04d3a7d25d53 tempest-ListServerFiltersTestJSON-1324676598 tempest-ListServerFiltersTestJSON-1324676598-project-member] Lock "c02f41e2-8a99-4f18-9d86-82fa702bb2b6" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 214.083s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 878.355539] env[68906]: DEBUG nova.compute.manager [None req-d4746353-557b-4464-928b-810736c8a5e8 tempest-ServersNegativeTestJSON-345353645 tempest-ServersNegativeTestJSON-345353645-project-member] [instance: a2414623-7871-4706-81db-7d15ca74fdab] Starting instance... 
{{(pid=68906) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 878.384355] env[68906]: DEBUG nova.compute.manager [None req-d4746353-557b-4464-928b-810736c8a5e8 tempest-ServersNegativeTestJSON-345353645 tempest-ServersNegativeTestJSON-345353645-project-member] [instance: a2414623-7871-4706-81db-7d15ca74fdab] Instance disappeared before build. {{(pid=68906) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 878.407741] env[68906]: DEBUG oslo_concurrency.lockutils [None req-d4746353-557b-4464-928b-810736c8a5e8 tempest-ServersNegativeTestJSON-345353645 tempest-ServersNegativeTestJSON-345353645-project-member] Lock "a2414623-7871-4706-81db-7d15ca74fdab" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 212.871s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 878.422118] env[68906]: DEBUG nova.compute.manager [None req-cfcc6ec4-506b-49d6-b35d-2b745b5478c6 tempest-ServerShowV247Test-1864688297 tempest-ServerShowV247Test-1864688297-project-member] [instance: 252028a3-3d3e-44c5-9c51-26752962a90d] Starting instance... {{(pid=68906) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 878.451388] env[68906]: DEBUG nova.compute.manager [None req-cfcc6ec4-506b-49d6-b35d-2b745b5478c6 tempest-ServerShowV247Test-1864688297 tempest-ServerShowV247Test-1864688297-project-member] [instance: 252028a3-3d3e-44c5-9c51-26752962a90d] Instance disappeared before build. {{(pid=68906) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 878.476232] env[68906]: DEBUG oslo_concurrency.lockutils [None req-cfcc6ec4-506b-49d6-b35d-2b745b5478c6 tempest-ServerShowV247Test-1864688297 tempest-ServerShowV247Test-1864688297-project-member] Lock "252028a3-3d3e-44c5-9c51-26752962a90d" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 212.856s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 878.490818] env[68906]: DEBUG nova.compute.manager [None req-6ad0b13c-1d88-4d71-b729-8d7af234ed8c tempest-ListServerFiltersTestJSON-1324676598 tempest-ListServerFiltersTestJSON-1324676598-project-member] [instance: 5ebd4d05-ddb3-4001-a526-a0c96b081818] Starting instance... {{(pid=68906) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 878.520045] env[68906]: DEBUG nova.compute.manager [None req-6ad0b13c-1d88-4d71-b729-8d7af234ed8c tempest-ListServerFiltersTestJSON-1324676598 tempest-ListServerFiltersTestJSON-1324676598-project-member] [instance: 5ebd4d05-ddb3-4001-a526-a0c96b081818] Instance disappeared before build. 
{{(pid=68906) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 878.549234] env[68906]: DEBUG oslo_concurrency.lockutils [None req-6ad0b13c-1d88-4d71-b729-8d7af234ed8c tempest-ListServerFiltersTestJSON-1324676598 tempest-ListServerFiltersTestJSON-1324676598-project-member] Lock "5ebd4d05-ddb3-4001-a526-a0c96b081818" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 212.203s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 878.562874] env[68906]: DEBUG nova.compute.manager [None req-6779e842-3485-4ec7-8ae6-ff66c04d0527 tempest-ServerDiagnosticsTest-1865925156 tempest-ServerDiagnosticsTest-1865925156-project-member] [instance: c2e2265b-aef3-4a8c-ae03-314e679af64b] Starting instance... {{(pid=68906) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 878.596085] env[68906]: DEBUG nova.compute.manager [None req-6779e842-3485-4ec7-8ae6-ff66c04d0527 tempest-ServerDiagnosticsTest-1865925156 tempest-ServerDiagnosticsTest-1865925156-project-member] [instance: c2e2265b-aef3-4a8c-ae03-314e679af64b] Instance disappeared before build. {{(pid=68906) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 878.626183] env[68906]: DEBUG oslo_concurrency.lockutils [None req-6779e842-3485-4ec7-8ae6-ff66c04d0527 tempest-ServerDiagnosticsTest-1865925156 tempest-ServerDiagnosticsTest-1865925156-project-member] Lock "c2e2265b-aef3-4a8c-ae03-314e679af64b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 211.917s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 878.641327] env[68906]: DEBUG nova.compute.manager [None req-bcd4eaa1-350f-4523-a418-b08be646decd tempest-ListServerFiltersTestJSON-1324676598 tempest-ListServerFiltersTestJSON-1324676598-project-member] [instance: 3ce9d4bd-3d7a-4191-9d3e-892efc81e8a2] Starting instance... {{(pid=68906) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 878.673628] env[68906]: DEBUG nova.compute.manager [None req-bcd4eaa1-350f-4523-a418-b08be646decd tempest-ListServerFiltersTestJSON-1324676598 tempest-ListServerFiltersTestJSON-1324676598-project-member] [instance: 3ce9d4bd-3d7a-4191-9d3e-892efc81e8a2] Instance disappeared before build. {{(pid=68906) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 878.705884] env[68906]: DEBUG oslo_concurrency.lockutils [None req-bcd4eaa1-350f-4523-a418-b08be646decd tempest-ListServerFiltersTestJSON-1324676598 tempest-ListServerFiltersTestJSON-1324676598-project-member] Lock "3ce9d4bd-3d7a-4191-9d3e-892efc81e8a2" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 210.504s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 878.719812] env[68906]: DEBUG nova.compute.manager [None req-3732deea-6ab7-4cd8-9207-35e7d046cba4 tempest-ServerShowV247Test-1864688297 tempest-ServerShowV247Test-1864688297-project-member] [instance: 3f26342e-89a8-4218-8875-8411eb8b16a0] Starting instance... 
{{(pid=68906) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 878.749320] env[68906]: DEBUG nova.compute.manager [None req-3732deea-6ab7-4cd8-9207-35e7d046cba4 tempest-ServerShowV247Test-1864688297 tempest-ServerShowV247Test-1864688297-project-member] [instance: 3f26342e-89a8-4218-8875-8411eb8b16a0] Instance disappeared before build. {{(pid=68906) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 878.777594] env[68906]: DEBUG oslo_concurrency.lockutils [None req-3732deea-6ab7-4cd8-9207-35e7d046cba4 tempest-ServerShowV247Test-1864688297 tempest-ServerShowV247Test-1864688297-project-member] Lock "3f26342e-89a8-4218-8875-8411eb8b16a0" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 209.741s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 878.789695] env[68906]: DEBUG nova.compute.manager [None req-2ea042cd-dbdb-4f1c-ad8b-7f6455e1ae47 tempest-ServersTestBootFromVolume-70355936 tempest-ServersTestBootFromVolume-70355936-project-member] [instance: 1fb9796e-e0d4-410d-bff1-a6a44b2a3580] Starting instance... {{(pid=68906) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 878.820433] env[68906]: DEBUG nova.compute.manager [None req-2ea042cd-dbdb-4f1c-ad8b-7f6455e1ae47 tempest-ServersTestBootFromVolume-70355936 tempest-ServersTestBootFromVolume-70355936-project-member] [instance: 1fb9796e-e0d4-410d-bff1-a6a44b2a3580] Instance disappeared before build. {{(pid=68906) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 878.842989] env[68906]: DEBUG oslo_concurrency.lockutils [None req-2ea042cd-dbdb-4f1c-ad8b-7f6455e1ae47 tempest-ServersTestBootFromVolume-70355936 tempest-ServersTestBootFromVolume-70355936-project-member] Lock "1fb9796e-e0d4-410d-bff1-a6a44b2a3580" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 208.256s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 878.856796] env[68906]: DEBUG nova.compute.manager [None req-d75acd4f-807f-4ecd-9ae2-747ee1c4928b tempest-ServerAddressesNegativeTestJSON-1929406516 tempest-ServerAddressesNegativeTestJSON-1929406516-project-member] [instance: b77ff68e-350b-4f65-bb62-dfb727281e50] Starting instance... {{(pid=68906) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 878.887479] env[68906]: DEBUG nova.compute.manager [None req-d75acd4f-807f-4ecd-9ae2-747ee1c4928b tempest-ServerAddressesNegativeTestJSON-1929406516 tempest-ServerAddressesNegativeTestJSON-1929406516-project-member] [instance: b77ff68e-350b-4f65-bb62-dfb727281e50] Instance disappeared before build. 
{{(pid=68906) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 878.921511] env[68906]: DEBUG oslo_concurrency.lockutils [None req-d75acd4f-807f-4ecd-9ae2-747ee1c4928b tempest-ServerAddressesNegativeTestJSON-1929406516 tempest-ServerAddressesNegativeTestJSON-1929406516-project-member] Lock "b77ff68e-350b-4f65-bb62-dfb727281e50" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 207.261s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 878.934809] env[68906]: DEBUG nova.compute.manager [None req-86da2890-c2a9-4c1a-8bb4-8828668edefb tempest-InstanceActionsTestJSON-874396935 tempest-InstanceActionsTestJSON-874396935-project-member] [instance: 29e5aa99-4e20-4b6f-a749-544b8c41a713] Starting instance... {{(pid=68906) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 878.965566] env[68906]: DEBUG nova.compute.manager [None req-86da2890-c2a9-4c1a-8bb4-8828668edefb tempest-InstanceActionsTestJSON-874396935 tempest-InstanceActionsTestJSON-874396935-project-member] [instance: 29e5aa99-4e20-4b6f-a749-544b8c41a713] Instance disappeared before build. {{(pid=68906) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 878.994805] env[68906]: DEBUG oslo_concurrency.lockutils [None req-86da2890-c2a9-4c1a-8bb4-8828668edefb tempest-InstanceActionsTestJSON-874396935 tempest-InstanceActionsTestJSON-874396935-project-member] Lock "29e5aa99-4e20-4b6f-a749-544b8c41a713" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 205.758s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 879.009470] env[68906]: DEBUG nova.compute.manager [None req-a0d2b26c-2e2f-42a2-b855-68741d9ae4e1 tempest-ServersAaction247Test-1755435843 tempest-ServersAaction247Test-1755435843-project-member] [instance: 20fa65c1-9ea0-4dc2-828e-8477c9f45baa] Starting instance... {{(pid=68906) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 879.037301] env[68906]: DEBUG nova.compute.manager [None req-a0d2b26c-2e2f-42a2-b855-68741d9ae4e1 tempest-ServersAaction247Test-1755435843 tempest-ServersAaction247Test-1755435843-project-member] [instance: 20fa65c1-9ea0-4dc2-828e-8477c9f45baa] Instance disappeared before build. {{(pid=68906) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 879.072325] env[68906]: DEBUG oslo_concurrency.lockutils [None req-a0d2b26c-2e2f-42a2-b855-68741d9ae4e1 tempest-ServersAaction247Test-1755435843 tempest-ServersAaction247Test-1755435843-project-member] Lock "20fa65c1-9ea0-4dc2-828e-8477c9f45baa" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 205.355s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 879.090032] env[68906]: DEBUG nova.compute.manager [None req-0325c848-7458-4b40-a533-6b073c138188 tempest-FloatingIPsAssociationNegativeTestJSON-883500755 tempest-FloatingIPsAssociationNegativeTestJSON-883500755-project-member] [instance: 98844da1-0e2a-46b5-8e72-c0f8dcd29b27] Starting instance... 
{{(pid=68906) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 879.126468] env[68906]: DEBUG nova.compute.manager [None req-0325c848-7458-4b40-a533-6b073c138188 tempest-FloatingIPsAssociationNegativeTestJSON-883500755 tempest-FloatingIPsAssociationNegativeTestJSON-883500755-project-member] [instance: 98844da1-0e2a-46b5-8e72-c0f8dcd29b27] Instance disappeared before build. {{(pid=68906) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 879.159431] env[68906]: DEBUG oslo_concurrency.lockutils [None req-0325c848-7458-4b40-a533-6b073c138188 tempest-FloatingIPsAssociationNegativeTestJSON-883500755 tempest-FloatingIPsAssociationNegativeTestJSON-883500755-project-member] Lock "98844da1-0e2a-46b5-8e72-c0f8dcd29b27" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 197.843s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 879.175940] env[68906]: DEBUG nova.compute.manager [None req-236f039b-07c6-4a4d-a7d0-3f9cc96aa05c tempest-AttachInterfacesV270Test-1110429519 tempest-AttachInterfacesV270Test-1110429519-project-member] [instance: 91d405f9-5ad1-4e8e-9a32-1a5ef81ecc63] Starting instance... {{(pid=68906) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 879.261835] env[68906]: DEBUG oslo_concurrency.lockutils [None req-236f039b-07c6-4a4d-a7d0-3f9cc96aa05c tempest-AttachInterfacesV270Test-1110429519 tempest-AttachInterfacesV270Test-1110429519-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 879.262201] env[68906]: DEBUG oslo_concurrency.lockutils [None req-236f039b-07c6-4a4d-a7d0-3f9cc96aa05c tempest-AttachInterfacesV270Test-1110429519 tempest-AttachInterfacesV270Test-1110429519-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 879.263952] env[68906]: INFO nova.compute.claims [None req-236f039b-07c6-4a4d-a7d0-3f9cc96aa05c tempest-AttachInterfacesV270Test-1110429519 tempest-AttachInterfacesV270Test-1110429519-project-member] [instance: 91d405f9-5ad1-4e8e-9a32-1a5ef81ecc63] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 879.619112] env[68906]: DEBUG oslo_concurrency.lockutils [None req-0a3299be-e4a9-4311-8e53-c592324fe331 tempest-AttachInterfacesUnderV243Test-365796744 tempest-AttachInterfacesUnderV243Test-365796744-project-member] Acquiring lock "9bd17d9a-2f34-4e99-91b1-a7c3fbc1f29a" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 879.619393] env[68906]: DEBUG oslo_concurrency.lockutils [None req-0a3299be-e4a9-4311-8e53-c592324fe331 tempest-AttachInterfacesUnderV243Test-365796744 tempest-AttachInterfacesUnderV243Test-365796744-project-member] Lock "9bd17d9a-2f34-4e99-91b1-a7c3fbc1f29a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 879.685325] env[68906]: DEBUG 
oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6a3a3178-9a01-4139-8ff7-9d2fd86424a9 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 879.694525] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-185410d0-104d-4c8a-a733-dca199b80a08 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 879.727837] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2467c7aa-8b8a-424d-8e42-5af3f2e77a01 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 879.736434] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8cfb23e4-c491-434e-983c-9e375e88e26c {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 879.750721] env[68906]: DEBUG nova.compute.provider_tree [None req-236f039b-07c6-4a4d-a7d0-3f9cc96aa05c tempest-AttachInterfacesV270Test-1110429519 tempest-AttachInterfacesV270Test-1110429519-project-member] Inventory has not changed in ProviderTree for provider: 1119f6db-bfd7-4ef3-bdff-5c6974dc249b {{(pid=68906) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 879.762856] env[68906]: DEBUG nova.scheduler.client.report [None req-236f039b-07c6-4a4d-a7d0-3f9cc96aa05c tempest-AttachInterfacesV270Test-1110429519 tempest-AttachInterfacesV270Test-1110429519-project-member] Inventory has not changed for provider 1119f6db-bfd7-4ef3-bdff-5c6974dc249b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 93, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68906) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 879.784020] env[68906]: DEBUG oslo_concurrency.lockutils [None req-236f039b-07c6-4a4d-a7d0-3f9cc96aa05c tempest-AttachInterfacesV270Test-1110429519 tempest-AttachInterfacesV270Test-1110429519-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.519s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 879.784020] env[68906]: DEBUG nova.compute.manager [None req-236f039b-07c6-4a4d-a7d0-3f9cc96aa05c tempest-AttachInterfacesV270Test-1110429519 tempest-AttachInterfacesV270Test-1110429519-project-member] [instance: 91d405f9-5ad1-4e8e-9a32-1a5ef81ecc63] Start building networks asynchronously for instance. 
{{(pid=68906) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 879.832866] env[68906]: DEBUG nova.compute.utils [None req-236f039b-07c6-4a4d-a7d0-3f9cc96aa05c tempest-AttachInterfacesV270Test-1110429519 tempest-AttachInterfacesV270Test-1110429519-project-member] Using /dev/sd instead of None {{(pid=68906) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 879.833158] env[68906]: DEBUG nova.compute.manager [None req-236f039b-07c6-4a4d-a7d0-3f9cc96aa05c tempest-AttachInterfacesV270Test-1110429519 tempest-AttachInterfacesV270Test-1110429519-project-member] [instance: 91d405f9-5ad1-4e8e-9a32-1a5ef81ecc63] Allocating IP information in the background. {{(pid=68906) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 879.833843] env[68906]: DEBUG nova.network.neutron [None req-236f039b-07c6-4a4d-a7d0-3f9cc96aa05c tempest-AttachInterfacesV270Test-1110429519 tempest-AttachInterfacesV270Test-1110429519-project-member] [instance: 91d405f9-5ad1-4e8e-9a32-1a5ef81ecc63] allocate_for_instance() {{(pid=68906) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 879.850815] env[68906]: DEBUG nova.compute.manager [None req-236f039b-07c6-4a4d-a7d0-3f9cc96aa05c tempest-AttachInterfacesV270Test-1110429519 tempest-AttachInterfacesV270Test-1110429519-project-member] [instance: 91d405f9-5ad1-4e8e-9a32-1a5ef81ecc63] Start building block device mappings for instance. {{(pid=68906) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 879.928293] env[68906]: DEBUG nova.compute.manager [None req-236f039b-07c6-4a4d-a7d0-3f9cc96aa05c tempest-AttachInterfacesV270Test-1110429519 tempest-AttachInterfacesV270Test-1110429519-project-member] [instance: 91d405f9-5ad1-4e8e-9a32-1a5ef81ecc63] Start spawning the instance on the hypervisor. 
{{(pid=68906) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 879.958959] env[68906]: DEBUG nova.virt.hardware [None req-236f039b-07c6-4a4d-a7d0-3f9cc96aa05c tempest-AttachInterfacesV270Test-1110429519 tempest-AttachInterfacesV270Test-1110429519-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-17T13:00:39Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-17T13:00:23Z,direct_url=,disk_format='vmdk',id=b1400c31-d33b-4e13-944f-4c645e62493e,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='1ae7bf3a375d41c6af5e7536af51ffd1',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-17T13:00:24Z,virtual_size=,visibility=), allow threads: False {{(pid=68906) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 879.958959] env[68906]: DEBUG nova.virt.hardware [None req-236f039b-07c6-4a4d-a7d0-3f9cc96aa05c tempest-AttachInterfacesV270Test-1110429519 tempest-AttachInterfacesV270Test-1110429519-project-member] Flavor limits 0:0:0 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 879.958959] env[68906]: DEBUG nova.virt.hardware [None req-236f039b-07c6-4a4d-a7d0-3f9cc96aa05c tempest-AttachInterfacesV270Test-1110429519 tempest-AttachInterfacesV270Test-1110429519-project-member] Image limits 0:0:0 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 879.959261] env[68906]: DEBUG nova.virt.hardware [None req-236f039b-07c6-4a4d-a7d0-3f9cc96aa05c tempest-AttachInterfacesV270Test-1110429519 tempest-AttachInterfacesV270Test-1110429519-project-member] Flavor pref 0:0:0 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 879.959261] env[68906]: DEBUG nova.virt.hardware [None req-236f039b-07c6-4a4d-a7d0-3f9cc96aa05c tempest-AttachInterfacesV270Test-1110429519 tempest-AttachInterfacesV270Test-1110429519-project-member] Image pref 0:0:0 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 879.959365] env[68906]: DEBUG nova.virt.hardware [None req-236f039b-07c6-4a4d-a7d0-3f9cc96aa05c tempest-AttachInterfacesV270Test-1110429519 tempest-AttachInterfacesV270Test-1110429519-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 879.959519] env[68906]: DEBUG nova.virt.hardware [None req-236f039b-07c6-4a4d-a7d0-3f9cc96aa05c tempest-AttachInterfacesV270Test-1110429519 tempest-AttachInterfacesV270Test-1110429519-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68906) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 879.959813] env[68906]: DEBUG nova.virt.hardware [None req-236f039b-07c6-4a4d-a7d0-3f9cc96aa05c tempest-AttachInterfacesV270Test-1110429519 tempest-AttachInterfacesV270Test-1110429519-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68906) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 879.959919] 
env[68906]: DEBUG nova.virt.hardware [None req-236f039b-07c6-4a4d-a7d0-3f9cc96aa05c tempest-AttachInterfacesV270Test-1110429519 tempest-AttachInterfacesV270Test-1110429519-project-member] Got 1 possible topologies {{(pid=68906) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 879.959997] env[68906]: DEBUG nova.virt.hardware [None req-236f039b-07c6-4a4d-a7d0-3f9cc96aa05c tempest-AttachInterfacesV270Test-1110429519 tempest-AttachInterfacesV270Test-1110429519-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68906) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 879.960184] env[68906]: DEBUG nova.virt.hardware [None req-236f039b-07c6-4a4d-a7d0-3f9cc96aa05c tempest-AttachInterfacesV270Test-1110429519 tempest-AttachInterfacesV270Test-1110429519-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68906) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 879.961066] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-03bf85bf-3348-4a6a-996f-14b7b3c7256d {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 879.973287] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2a763352-4a5f-4748-ab9e-4095e54d3a53 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 880.007153] env[68906]: DEBUG nova.policy [None req-236f039b-07c6-4a4d-a7d0-3f9cc96aa05c tempest-AttachInterfacesV270Test-1110429519 tempest-AttachInterfacesV270Test-1110429519-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'af4fb26f580248dbbde3c1ff705f80be', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'a820177d7fb74473805519c9ec4b26aa', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68906) authorize /opt/stack/nova/nova/policy.py:203}} [ 880.141349] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager.update_available_resource {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 880.156018] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 880.156332] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 880.156404] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=68906) 
inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 880.156567] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=68906) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 880.158430] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-46f1cdfa-68be-4255-b903-e7a0306deaaf {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 880.172547] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-025a2291-e919-47cb-b0f9-a0e4b6112c99 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 880.187281] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fff6e5cc-9f75-48d7-93c4-e434a517ac63 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 880.196147] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fc9e0c9e-6c07-4354-a0b0-de65c5a6674a {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 880.236171] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180936MB free_disk=93GB free_vcpus=48 pci_devices=None {{(pid=68906) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 880.236171] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 880.236171] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 880.314994] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 0540a4dc-1b86-4776-b633-f540af168a2b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 880.315184] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 4edb8b9f-b608-4be8-bfd3-65642710f9bd actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 880.315313] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance d6ca51b9-b284-405c-878e-fdbc326b73e1 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 880.315431] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance f42056e5-52cb-4d69-8022-ca643c49194e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 880.315545] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance ce63789a-1f0f-40ca-8368-ac3f84bb58cd actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 880.315657] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 13eebe4e-5984-46c3-bb73-cd783ad45df6 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 880.315766] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 9a2d2803-34b1-40f7-9349-e5734a217e18 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 880.315896] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance a7e0a28f-42a5-442e-b962-07771d2e6a27 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 880.316039] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance eb81e9b1-b573-4d7c-9ede-f8b32a43a201 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 880.316200] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 91d405f9-5ad1-4e8e-9a32-1a5ef81ecc63 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 880.337930] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 653c016d-c596-4f45-a18e-55f2d1935166 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 880.349665] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 627c0227-72ca-4a77-aca1-bc3112955e7a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 880.364044] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 03e8dff3-b6b8-4754-8725-dddc9f9e6216 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 880.379128] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance d3c5fdf4-a775-4b88-9bc2-ce9f31a9e6ac has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 880.392662] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 242433e2-5b59-4b19-ba8d-80432ee4b7b7 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 880.403954] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance acc11633-a489-4d8f-ad76-f17049a91545 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 880.419170] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 0874bf05-e156-404e-a067-869e370fd14b has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 880.437668] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance e7286888-d79d-4632-9c06-69c1ef47fa50 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 880.452947] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 641cca5b-d749-4331-a5e0-8acb6d47cba2 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 880.465963] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 6d7b4648-a12f-4c3c-8465-b8fb37eb0d3c has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 880.486245] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance ad955cdc-85f1-4096-b2ec-7635d289ee57 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 880.506481] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 9bd17d9a-2f34-4e99-91b1-a7c3fbc1f29a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 880.506747] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=68906) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 880.506912] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=68906) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 880.525403] env[68906]: DEBUG oslo_concurrency.lockutils [None req-2fcfd0ef-6d38-4dd9-8a43-b17729caedcf tempest-ServerRescueNegativeTestJSON-608372629 tempest-ServerRescueNegativeTestJSON-608372629-project-member] Acquiring lock "a37ef3ce-1c29-48fe-b9c6-023da5b3db71" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 880.525634] env[68906]: DEBUG oslo_concurrency.lockutils [None req-2fcfd0ef-6d38-4dd9-8a43-b17729caedcf tempest-ServerRescueNegativeTestJSON-608372629 tempest-ServerRescueNegativeTestJSON-608372629-project-member] Lock "a37ef3ce-1c29-48fe-b9c6-023da5b3db71" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 880.893436] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e5300193-5e51-4923-a635-9af6c33d8ef4 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 880.903252] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c9d8249b-f4e8-4ff9-9057-5e3fea756b1d {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 880.944554] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-313a6e70-0072-482c-8d89-56f0e07a5511 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 880.952430] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-97fa9cc6-96e4-402a-a732-500e9de7cbcb {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 880.967930] env[68906]: DEBUG nova.compute.provider_tree [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Inventory has not changed in ProviderTree for provider: 1119f6db-bfd7-4ef3-bdff-5c6974dc249b {{(pid=68906) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 880.973386] env[68906]: DEBUG nova.network.neutron [None req-236f039b-07c6-4a4d-a7d0-3f9cc96aa05c tempest-AttachInterfacesV270Test-1110429519 tempest-AttachInterfacesV270Test-1110429519-project-member] [instance: 91d405f9-5ad1-4e8e-9a32-1a5ef81ecc63] Successfully created port: d578f995-70b2-4400-8144-02bbbcf3d129 {{(pid=68906) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 880.991924] env[68906]: DEBUG 
nova.scheduler.client.report [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Inventory has not changed for provider 1119f6db-bfd7-4ef3-bdff-5c6974dc249b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 93, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68906) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 881.009566] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=68906) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 881.009780] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.776s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 883.010116] env[68906]: DEBUG nova.network.neutron [None req-236f039b-07c6-4a4d-a7d0-3f9cc96aa05c tempest-AttachInterfacesV270Test-1110429519 tempest-AttachInterfacesV270Test-1110429519-project-member] [instance: 91d405f9-5ad1-4e8e-9a32-1a5ef81ecc63] Successfully updated port: d578f995-70b2-4400-8144-02bbbcf3d129 {{(pid=68906) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 883.022509] env[68906]: DEBUG oslo_concurrency.lockutils [None req-236f039b-07c6-4a4d-a7d0-3f9cc96aa05c tempest-AttachInterfacesV270Test-1110429519 tempest-AttachInterfacesV270Test-1110429519-project-member] Acquiring lock "refresh_cache-91d405f9-5ad1-4e8e-9a32-1a5ef81ecc63" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 883.022673] env[68906]: DEBUG oslo_concurrency.lockutils [None req-236f039b-07c6-4a4d-a7d0-3f9cc96aa05c tempest-AttachInterfacesV270Test-1110429519 tempest-AttachInterfacesV270Test-1110429519-project-member] Acquired lock "refresh_cache-91d405f9-5ad1-4e8e-9a32-1a5ef81ecc63" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 883.022960] env[68906]: DEBUG nova.network.neutron [None req-236f039b-07c6-4a4d-a7d0-3f9cc96aa05c tempest-AttachInterfacesV270Test-1110429519 tempest-AttachInterfacesV270Test-1110429519-project-member] [instance: 91d405f9-5ad1-4e8e-9a32-1a5ef81ecc63] Building network info cache for instance {{(pid=68906) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 883.108740] env[68906]: DEBUG nova.network.neutron [None req-236f039b-07c6-4a4d-a7d0-3f9cc96aa05c tempest-AttachInterfacesV270Test-1110429519 tempest-AttachInterfacesV270Test-1110429519-project-member] [instance: 91d405f9-5ad1-4e8e-9a32-1a5ef81ecc63] Instance cache missing network info. 
{{(pid=68906) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 883.603871] env[68906]: DEBUG nova.network.neutron [None req-236f039b-07c6-4a4d-a7d0-3f9cc96aa05c tempest-AttachInterfacesV270Test-1110429519 tempest-AttachInterfacesV270Test-1110429519-project-member] [instance: 91d405f9-5ad1-4e8e-9a32-1a5ef81ecc63] Updating instance_info_cache with network_info: [{"id": "d578f995-70b2-4400-8144-02bbbcf3d129", "address": "fa:16:3e:f3:46:58", "network": {"id": "8b90a26d-840b-46e8-bf52-1e31d9281703", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-28703609-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "a820177d7fb74473805519c9ec4b26aa", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "30f1dacf-8988-41b8-aa8f-e9530f65ef46", "external-id": "nsx-vlan-transportzone-561", "segmentation_id": 561, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapd578f995-70", "ovs_interfaceid": "d578f995-70b2-4400-8144-02bbbcf3d129", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68906) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 883.627053] env[68906]: DEBUG oslo_concurrency.lockutils [None req-236f039b-07c6-4a4d-a7d0-3f9cc96aa05c tempest-AttachInterfacesV270Test-1110429519 tempest-AttachInterfacesV270Test-1110429519-project-member] Releasing lock "refresh_cache-91d405f9-5ad1-4e8e-9a32-1a5ef81ecc63" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 883.627053] env[68906]: DEBUG nova.compute.manager [None req-236f039b-07c6-4a4d-a7d0-3f9cc96aa05c tempest-AttachInterfacesV270Test-1110429519 tempest-AttachInterfacesV270Test-1110429519-project-member] [instance: 91d405f9-5ad1-4e8e-9a32-1a5ef81ecc63] Instance network_info: |[{"id": "d578f995-70b2-4400-8144-02bbbcf3d129", "address": "fa:16:3e:f3:46:58", "network": {"id": "8b90a26d-840b-46e8-bf52-1e31d9281703", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-28703609-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "a820177d7fb74473805519c9ec4b26aa", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "30f1dacf-8988-41b8-aa8f-e9530f65ef46", "external-id": "nsx-vlan-transportzone-561", "segmentation_id": 561, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapd578f995-70", "ovs_interfaceid": "d578f995-70b2-4400-8144-02bbbcf3d129", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=68906) _allocate_network_async 
/opt/stack/nova/nova/compute/manager.py:1971}} [ 883.627225] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-236f039b-07c6-4a4d-a7d0-3f9cc96aa05c tempest-AttachInterfacesV270Test-1110429519 tempest-AttachInterfacesV270Test-1110429519-project-member] [instance: 91d405f9-5ad1-4e8e-9a32-1a5ef81ecc63] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:f3:46:58', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '30f1dacf-8988-41b8-aa8f-e9530f65ef46', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'd578f995-70b2-4400-8144-02bbbcf3d129', 'vif_model': 'vmxnet3'}] {{(pid=68906) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 883.636870] env[68906]: DEBUG nova.virt.vmwareapi.vm_util [None req-236f039b-07c6-4a4d-a7d0-3f9cc96aa05c tempest-AttachInterfacesV270Test-1110429519 tempest-AttachInterfacesV270Test-1110429519-project-member] Creating folder: Project (a820177d7fb74473805519c9ec4b26aa). Parent ref: group-v694750. {{(pid=68906) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 883.637522] env[68906]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-e1e5f04a-448d-4d27-bc03-4eae3a119a00 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 883.651309] env[68906]: INFO nova.virt.vmwareapi.vm_util [None req-236f039b-07c6-4a4d-a7d0-3f9cc96aa05c tempest-AttachInterfacesV270Test-1110429519 tempest-AttachInterfacesV270Test-1110429519-project-member] Created folder: Project (a820177d7fb74473805519c9ec4b26aa) in parent group-v694750. [ 883.651309] env[68906]: DEBUG nova.virt.vmwareapi.vm_util [None req-236f039b-07c6-4a4d-a7d0-3f9cc96aa05c tempest-AttachInterfacesV270Test-1110429519 tempest-AttachInterfacesV270Test-1110429519-project-member] Creating folder: Instances. Parent ref: group-v694795. {{(pid=68906) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 883.651309] env[68906]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-2b521830-b397-4a63-88eb-f2c079e1ebbb {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 883.660284] env[68906]: INFO nova.virt.vmwareapi.vm_util [None req-236f039b-07c6-4a4d-a7d0-3f9cc96aa05c tempest-AttachInterfacesV270Test-1110429519 tempest-AttachInterfacesV270Test-1110429519-project-member] Created folder: Instances in parent group-v694795. [ 883.660638] env[68906]: DEBUG oslo.service.loopingcall [None req-236f039b-07c6-4a4d-a7d0-3f9cc96aa05c tempest-AttachInterfacesV270Test-1110429519 tempest-AttachInterfacesV270Test-1110429519-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=68906) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 883.660756] env[68906]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 91d405f9-5ad1-4e8e-9a32-1a5ef81ecc63] Creating VM on the ESX host {{(pid=68906) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 883.660957] env[68906]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-ad890b1e-0ec7-4a97-a2ea-62179cc0932c {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 883.687812] env[68906]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 883.687812] env[68906]: value = "task-3475320" [ 883.687812] env[68906]: _type = "Task" [ 883.687812] env[68906]: } to complete. {{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 883.697435] env[68906]: DEBUG oslo_vmware.api [-] Task: {'id': task-3475320, 'name': CreateVM_Task} progress is 0%. {{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 883.798287] env[68906]: DEBUG oslo_concurrency.lockutils [None req-3ef12865-803b-471a-95a5-dc05f12c5571 tempest-ServerRescueNegativeTestJSON-608372629 tempest-ServerRescueNegativeTestJSON-608372629-project-member] Acquiring lock "ee17e223-bec7-4541-8cb2-25e4a6c32b34" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 883.798287] env[68906]: DEBUG oslo_concurrency.lockutils [None req-3ef12865-803b-471a-95a5-dc05f12c5571 tempest-ServerRescueNegativeTestJSON-608372629 tempest-ServerRescueNegativeTestJSON-608372629-project-member] Lock "ee17e223-bec7-4541-8cb2-25e4a6c32b34" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 883.975224] env[68906]: DEBUG nova.compute.manager [req-b4a02669-903c-4884-82c0-49ccc6d74527 req-c7e6445c-90dc-4cab-b1cc-42eaffbb4ed4 service nova] [instance: 91d405f9-5ad1-4e8e-9a32-1a5ef81ecc63] Received event network-vif-plugged-d578f995-70b2-4400-8144-02bbbcf3d129 {{(pid=68906) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 883.975224] env[68906]: DEBUG oslo_concurrency.lockutils [req-b4a02669-903c-4884-82c0-49ccc6d74527 req-c7e6445c-90dc-4cab-b1cc-42eaffbb4ed4 service nova] Acquiring lock "91d405f9-5ad1-4e8e-9a32-1a5ef81ecc63-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 883.975224] env[68906]: DEBUG oslo_concurrency.lockutils [req-b4a02669-903c-4884-82c0-49ccc6d74527 req-c7e6445c-90dc-4cab-b1cc-42eaffbb4ed4 service nova] Lock "91d405f9-5ad1-4e8e-9a32-1a5ef81ecc63-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 883.975224] env[68906]: DEBUG oslo_concurrency.lockutils [req-b4a02669-903c-4884-82c0-49ccc6d74527 req-c7e6445c-90dc-4cab-b1cc-42eaffbb4ed4 service nova] Lock "91d405f9-5ad1-4e8e-9a32-1a5ef81ecc63-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s 
{{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 883.975383] env[68906]: DEBUG nova.compute.manager [req-b4a02669-903c-4884-82c0-49ccc6d74527 req-c7e6445c-90dc-4cab-b1cc-42eaffbb4ed4 service nova] [instance: 91d405f9-5ad1-4e8e-9a32-1a5ef81ecc63] No waiting events found dispatching network-vif-plugged-d578f995-70b2-4400-8144-02bbbcf3d129 {{(pid=68906) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 883.975829] env[68906]: WARNING nova.compute.manager [req-b4a02669-903c-4884-82c0-49ccc6d74527 req-c7e6445c-90dc-4cab-b1cc-42eaffbb4ed4 service nova] [instance: 91d405f9-5ad1-4e8e-9a32-1a5ef81ecc63] Received unexpected event network-vif-plugged-d578f995-70b2-4400-8144-02bbbcf3d129 for instance with vm_state building and task_state spawning. [ 884.202937] env[68906]: DEBUG oslo_vmware.api [-] Task: {'id': task-3475320, 'name': CreateVM_Task, 'duration_secs': 0.319504} completed successfully. {{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 884.204452] env[68906]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 91d405f9-5ad1-4e8e-9a32-1a5ef81ecc63] Created VM on the ESX host {{(pid=68906) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 884.205188] env[68906]: DEBUG oslo_concurrency.lockutils [None req-236f039b-07c6-4a4d-a7d0-3f9cc96aa05c tempest-AttachInterfacesV270Test-1110429519 tempest-AttachInterfacesV270Test-1110429519-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 884.205369] env[68906]: DEBUG oslo_concurrency.lockutils [None req-236f039b-07c6-4a4d-a7d0-3f9cc96aa05c tempest-AttachInterfacesV270Test-1110429519 tempest-AttachInterfacesV270Test-1110429519-project-member] Acquired lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 884.205687] env[68906]: DEBUG oslo_concurrency.lockutils [None req-236f039b-07c6-4a4d-a7d0-3f9cc96aa05c tempest-AttachInterfacesV270Test-1110429519 tempest-AttachInterfacesV270Test-1110429519-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 884.205966] env[68906]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-64fd1b70-87d1-4d53-ad2b-8bf35586212b {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 884.212462] env[68906]: DEBUG oslo_vmware.api [None req-236f039b-07c6-4a4d-a7d0-3f9cc96aa05c tempest-AttachInterfacesV270Test-1110429519 tempest-AttachInterfacesV270Test-1110429519-project-member] Waiting for the task: (returnval){ [ 884.212462] env[68906]: value = "session[52a3cb0d-1212-490c-2669-91043b4da4d8]52e13b4d-1e51-d508-5e5a-6a5864cd9297" [ 884.212462] env[68906]: _type = "Task" [ 884.212462] env[68906]: } to complete. 
{{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 884.222828] env[68906]: DEBUG oslo_vmware.api [None req-236f039b-07c6-4a4d-a7d0-3f9cc96aa05c tempest-AttachInterfacesV270Test-1110429519 tempest-AttachInterfacesV270Test-1110429519-project-member] Task: {'id': session[52a3cb0d-1212-490c-2669-91043b4da4d8]52e13b4d-1e51-d508-5e5a-6a5864cd9297, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 884.725116] env[68906]: DEBUG oslo_concurrency.lockutils [None req-236f039b-07c6-4a4d-a7d0-3f9cc96aa05c tempest-AttachInterfacesV270Test-1110429519 tempest-AttachInterfacesV270Test-1110429519-project-member] Releasing lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 884.725634] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-236f039b-07c6-4a4d-a7d0-3f9cc96aa05c tempest-AttachInterfacesV270Test-1110429519 tempest-AttachInterfacesV270Test-1110429519-project-member] [instance: 91d405f9-5ad1-4e8e-9a32-1a5ef81ecc63] Processing image b1400c31-d33b-4e13-944f-4c645e62493e {{(pid=68906) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 884.726102] env[68906]: DEBUG oslo_concurrency.lockutils [None req-236f039b-07c6-4a4d-a7d0-3f9cc96aa05c tempest-AttachInterfacesV270Test-1110429519 tempest-AttachInterfacesV270Test-1110429519-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e/b1400c31-d33b-4e13-944f-4c645e62493e.vmdk" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 884.938357] env[68906]: DEBUG oslo_concurrency.lockutils [None req-759a674a-a9dc-4cda-86d0-b5ec5eba1b78 tempest-ServerActionsTestOtherB-612778985 tempest-ServerActionsTestOtherB-612778985-project-member] Acquiring lock "56f936b4-680d-40db-84ab-8eb319f6ee83" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 884.939030] env[68906]: DEBUG oslo_concurrency.lockutils [None req-759a674a-a9dc-4cda-86d0-b5ec5eba1b78 tempest-ServerActionsTestOtherB-612778985 tempest-ServerActionsTestOtherB-612778985-project-member] Lock "56f936b4-680d-40db-84ab-8eb319f6ee83" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 884.941232] env[68906]: DEBUG oslo_concurrency.lockutils [None req-5dfc4571-f06c-4c81-9226-1d215bbb2db9 tempest-ServerMetadataNegativeTestJSON-343818070 tempest-ServerMetadataNegativeTestJSON-343818070-project-member] Acquiring lock "3ba4a60f-6c41-4e1e-8928-f1b95b885028" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 884.941475] env[68906]: DEBUG oslo_concurrency.lockutils [None req-5dfc4571-f06c-4c81-9226-1d215bbb2db9 tempest-ServerMetadataNegativeTestJSON-343818070 tempest-ServerMetadataNegativeTestJSON-343818070-project-member] Lock "3ba4a60f-6c41-4e1e-8928-f1b95b885028" acquired by 
"nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 885.721637] env[68906]: DEBUG oslo_concurrency.lockutils [None req-97a35496-b318-4c52-a18b-dac8228b3518 tempest-AttachInterfacesV270Test-1110429519 tempest-AttachInterfacesV270Test-1110429519-project-member] Acquiring lock "91d405f9-5ad1-4e8e-9a32-1a5ef81ecc63" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 886.213492] env[68906]: DEBUG nova.compute.manager [req-f4273fcf-7ebd-4ced-af45-29978b0de87d req-12e5cc52-2620-4e06-a665-9af1b17c9762 service nova] [instance: 91d405f9-5ad1-4e8e-9a32-1a5ef81ecc63] Received event network-changed-d578f995-70b2-4400-8144-02bbbcf3d129 {{(pid=68906) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 886.213492] env[68906]: DEBUG nova.compute.manager [req-f4273fcf-7ebd-4ced-af45-29978b0de87d req-12e5cc52-2620-4e06-a665-9af1b17c9762 service nova] [instance: 91d405f9-5ad1-4e8e-9a32-1a5ef81ecc63] Refreshing instance network info cache due to event network-changed-d578f995-70b2-4400-8144-02bbbcf3d129. {{(pid=68906) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 886.217021] env[68906]: DEBUG oslo_concurrency.lockutils [req-f4273fcf-7ebd-4ced-af45-29978b0de87d req-12e5cc52-2620-4e06-a665-9af1b17c9762 service nova] Acquiring lock "refresh_cache-91d405f9-5ad1-4e8e-9a32-1a5ef81ecc63" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 886.217021] env[68906]: DEBUG oslo_concurrency.lockutils [req-f4273fcf-7ebd-4ced-af45-29978b0de87d req-12e5cc52-2620-4e06-a665-9af1b17c9762 service nova] Acquired lock "refresh_cache-91d405f9-5ad1-4e8e-9a32-1a5ef81ecc63" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 886.217021] env[68906]: DEBUG nova.network.neutron [req-f4273fcf-7ebd-4ced-af45-29978b0de87d req-12e5cc52-2620-4e06-a665-9af1b17c9762 service nova] [instance: 91d405f9-5ad1-4e8e-9a32-1a5ef81ecc63] Refreshing network info cache for port d578f995-70b2-4400-8144-02bbbcf3d129 {{(pid=68906) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 886.250957] env[68906]: DEBUG oslo_concurrency.lockutils [None req-8d8d8524-17bd-4588-8951-e1f296fbac81 tempest-MigrationsAdminTest-890819033 tempest-MigrationsAdminTest-890819033-project-member] Acquiring lock "faec727e-bd92-4201-aaca-5863208be265" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 886.250957] env[68906]: DEBUG oslo_concurrency.lockutils [None req-8d8d8524-17bd-4588-8951-e1f296fbac81 tempest-MigrationsAdminTest-890819033 tempest-MigrationsAdminTest-890819033-project-member] Lock "faec727e-bd92-4201-aaca-5863208be265" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 887.005070] env[68906]: DEBUG nova.network.neutron [req-f4273fcf-7ebd-4ced-af45-29978b0de87d req-12e5cc52-2620-4e06-a665-9af1b17c9762 service nova] [instance: 
91d405f9-5ad1-4e8e-9a32-1a5ef81ecc63] Updated VIF entry in instance network info cache for port d578f995-70b2-4400-8144-02bbbcf3d129. {{(pid=68906) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 887.005776] env[68906]: DEBUG nova.network.neutron [req-f4273fcf-7ebd-4ced-af45-29978b0de87d req-12e5cc52-2620-4e06-a665-9af1b17c9762 service nova] [instance: 91d405f9-5ad1-4e8e-9a32-1a5ef81ecc63] Updating instance_info_cache with network_info: [{"id": "d578f995-70b2-4400-8144-02bbbcf3d129", "address": "fa:16:3e:f3:46:58", "network": {"id": "8b90a26d-840b-46e8-bf52-1e31d9281703", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-28703609-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "a820177d7fb74473805519c9ec4b26aa", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "30f1dacf-8988-41b8-aa8f-e9530f65ef46", "external-id": "nsx-vlan-transportzone-561", "segmentation_id": 561, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapd578f995-70", "ovs_interfaceid": "d578f995-70b2-4400-8144-02bbbcf3d129", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68906) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 887.017726] env[68906]: DEBUG oslo_concurrency.lockutils [req-f4273fcf-7ebd-4ced-af45-29978b0de87d req-12e5cc52-2620-4e06-a665-9af1b17c9762 service nova] Releasing lock "refresh_cache-91d405f9-5ad1-4e8e-9a32-1a5ef81ecc63" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 890.207517] env[68906]: DEBUG oslo_concurrency.lockutils [None req-1475f613-fd4b-4f1d-8fac-623e658f362f tempest-AttachVolumeShelveTestJSON-1059946953 tempest-AttachVolumeShelveTestJSON-1059946953-project-member] Acquiring lock "18a5c392-b836-4d2a-bb77-d4af0b9fdb81" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 890.207892] env[68906]: DEBUG oslo_concurrency.lockutils [None req-1475f613-fd4b-4f1d-8fac-623e658f362f tempest-AttachVolumeShelveTestJSON-1059946953 tempest-AttachVolumeShelveTestJSON-1059946953-project-member] Lock "18a5c392-b836-4d2a-bb77-d4af0b9fdb81" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 897.287799] env[68906]: DEBUG oslo_concurrency.lockutils [None req-237495d6-6000-404f-b4a3-e46a4b8ba4ce tempest-DeleteServersTestJSON-1763795391 tempest-DeleteServersTestJSON-1763795391-project-member] Acquiring lock "2e6de0b1-335b-49bd-aa15-3fd4cc4b4e9e" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 897.288667] env[68906]: DEBUG 
oslo_concurrency.lockutils [None req-237495d6-6000-404f-b4a3-e46a4b8ba4ce tempest-DeleteServersTestJSON-1763795391 tempest-DeleteServersTestJSON-1763795391-project-member] Lock "2e6de0b1-335b-49bd-aa15-3fd4cc4b4e9e" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 897.841365] env[68906]: DEBUG oslo_concurrency.lockutils [None req-8595702b-bc02-4dde-85d1-b5b6b00301b0 tempest-ImagesOneServerTestJSON-2105933643 tempest-ImagesOneServerTestJSON-2105933643-project-member] Acquiring lock "d71bae07-54c1-427b-bfe1-2467369cd80c" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 897.841893] env[68906]: DEBUG oslo_concurrency.lockutils [None req-8595702b-bc02-4dde-85d1-b5b6b00301b0 tempest-ImagesOneServerTestJSON-2105933643 tempest-ImagesOneServerTestJSON-2105933643-project-member] Lock "d71bae07-54c1-427b-bfe1-2467369cd80c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 924.943625] env[68906]: DEBUG oslo_concurrency.lockutils [None req-00e83854-94e1-4ab2-8698-d76f53e7ae92 tempest-ServerActionsV293TestJSON-1770613532 tempest-ServerActionsV293TestJSON-1770613532-project-member] Acquiring lock "682f0e61-471f-47fb-98de-02449b17d241" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 924.943625] env[68906]: DEBUG oslo_concurrency.lockutils [None req-00e83854-94e1-4ab2-8698-d76f53e7ae92 tempest-ServerActionsV293TestJSON-1770613532 tempest-ServerActionsV293TestJSON-1770613532-project-member] Lock "682f0e61-471f-47fb-98de-02449b17d241" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 925.712342] env[68906]: WARNING oslo_vmware.rw_handles [None req-96841733-5efb-421f-b645-bab4686e4f9f tempest-ServerActionsTestJSON-1120749817 tempest-ServerActionsTestJSON-1120749817-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 925.712342] env[68906]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 925.712342] env[68906]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 925.712342] env[68906]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 925.712342] env[68906]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 925.712342] env[68906]: ERROR oslo_vmware.rw_handles response.begin() [ 925.712342] env[68906]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 925.712342] env[68906]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 925.712342] env[68906]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 
925.712342] env[68906]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 925.712342] env[68906]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 925.712342] env[68906]: ERROR oslo_vmware.rw_handles [ 925.712801] env[68906]: DEBUG nova.virt.vmwareapi.images [None req-96841733-5efb-421f-b645-bab4686e4f9f tempest-ServerActionsTestJSON-1120749817 tempest-ServerActionsTestJSON-1120749817-project-member] [instance: 0540a4dc-1b86-4776-b633-f540af168a2b] Downloaded image file data b1400c31-d33b-4e13-944f-4c645e62493e to vmware_temp/1996884e-9fc7-431a-916e-3800ba26d166/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk on the data store datastore2 {{(pid=68906) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 925.714480] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-96841733-5efb-421f-b645-bab4686e4f9f tempest-ServerActionsTestJSON-1120749817 tempest-ServerActionsTestJSON-1120749817-project-member] [instance: 0540a4dc-1b86-4776-b633-f540af168a2b] Caching image {{(pid=68906) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 925.714729] env[68906]: DEBUG nova.virt.vmwareapi.vm_util [None req-96841733-5efb-421f-b645-bab4686e4f9f tempest-ServerActionsTestJSON-1120749817 tempest-ServerActionsTestJSON-1120749817-project-member] Copying Virtual Disk [datastore2] vmware_temp/1996884e-9fc7-431a-916e-3800ba26d166/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk to [datastore2] vmware_temp/1996884e-9fc7-431a-916e-3800ba26d166/b1400c31-d33b-4e13-944f-4c645e62493e/b1400c31-d33b-4e13-944f-4c645e62493e.vmdk {{(pid=68906) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 925.715028] env[68906]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-f6e02609-b28a-4fc3-98db-76689f1b7f63 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 925.722837] env[68906]: DEBUG oslo_vmware.api [None req-96841733-5efb-421f-b645-bab4686e4f9f tempest-ServerActionsTestJSON-1120749817 tempest-ServerActionsTestJSON-1120749817-project-member] Waiting for the task: (returnval){ [ 925.722837] env[68906]: value = "task-3475331" [ 925.722837] env[68906]: _type = "Task" [ 925.722837] env[68906]: } to complete. {{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 925.730860] env[68906]: DEBUG oslo_vmware.api [None req-96841733-5efb-421f-b645-bab4686e4f9f tempest-ServerActionsTestJSON-1120749817 tempest-ServerActionsTestJSON-1120749817-project-member] Task: {'id': task-3475331, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 926.233552] env[68906]: DEBUG oslo_vmware.exceptions [None req-96841733-5efb-421f-b645-bab4686e4f9f tempest-ServerActionsTestJSON-1120749817 tempest-ServerActionsTestJSON-1120749817-project-member] Fault InvalidArgument not matched. 
{{(pid=68906) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 926.233841] env[68906]: DEBUG oslo_concurrency.lockutils [None req-96841733-5efb-421f-b645-bab4686e4f9f tempest-ServerActionsTestJSON-1120749817 tempest-ServerActionsTestJSON-1120749817-project-member] Releasing lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e/b1400c31-d33b-4e13-944f-4c645e62493e.vmdk" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 926.236575] env[68906]: ERROR nova.compute.manager [None req-96841733-5efb-421f-b645-bab4686e4f9f tempest-ServerActionsTestJSON-1120749817 tempest-ServerActionsTestJSON-1120749817-project-member] [instance: 0540a4dc-1b86-4776-b633-f540af168a2b] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 926.236575] env[68906]: Faults: ['InvalidArgument'] [ 926.236575] env[68906]: ERROR nova.compute.manager [instance: 0540a4dc-1b86-4776-b633-f540af168a2b] Traceback (most recent call last): [ 926.236575] env[68906]: ERROR nova.compute.manager [instance: 0540a4dc-1b86-4776-b633-f540af168a2b] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 926.236575] env[68906]: ERROR nova.compute.manager [instance: 0540a4dc-1b86-4776-b633-f540af168a2b] yield resources [ 926.236575] env[68906]: ERROR nova.compute.manager [instance: 0540a4dc-1b86-4776-b633-f540af168a2b] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 926.236575] env[68906]: ERROR nova.compute.manager [instance: 0540a4dc-1b86-4776-b633-f540af168a2b] self.driver.spawn(context, instance, image_meta, [ 926.236575] env[68906]: ERROR nova.compute.manager [instance: 0540a4dc-1b86-4776-b633-f540af168a2b] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 926.236575] env[68906]: ERROR nova.compute.manager [instance: 0540a4dc-1b86-4776-b633-f540af168a2b] self._vmops.spawn(context, instance, image_meta, injected_files, [ 926.236575] env[68906]: ERROR nova.compute.manager [instance: 0540a4dc-1b86-4776-b633-f540af168a2b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 926.236575] env[68906]: ERROR nova.compute.manager [instance: 0540a4dc-1b86-4776-b633-f540af168a2b] self._fetch_image_if_missing(context, vi) [ 926.236575] env[68906]: ERROR nova.compute.manager [instance: 0540a4dc-1b86-4776-b633-f540af168a2b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 926.238029] env[68906]: ERROR nova.compute.manager [instance: 0540a4dc-1b86-4776-b633-f540af168a2b] image_cache(vi, tmp_image_ds_loc) [ 926.238029] env[68906]: ERROR nova.compute.manager [instance: 0540a4dc-1b86-4776-b633-f540af168a2b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 926.238029] env[68906]: ERROR nova.compute.manager [instance: 0540a4dc-1b86-4776-b633-f540af168a2b] vm_util.copy_virtual_disk( [ 926.238029] env[68906]: ERROR nova.compute.manager [instance: 0540a4dc-1b86-4776-b633-f540af168a2b] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 926.238029] env[68906]: ERROR nova.compute.manager [instance: 0540a4dc-1b86-4776-b633-f540af168a2b] session._wait_for_task(vmdk_copy_task) [ 926.238029] env[68906]: ERROR nova.compute.manager [instance: 0540a4dc-1b86-4776-b633-f540af168a2b] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in 
_wait_for_task [ 926.238029] env[68906]: ERROR nova.compute.manager [instance: 0540a4dc-1b86-4776-b633-f540af168a2b] return self.wait_for_task(task_ref) [ 926.238029] env[68906]: ERROR nova.compute.manager [instance: 0540a4dc-1b86-4776-b633-f540af168a2b] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 926.238029] env[68906]: ERROR nova.compute.manager [instance: 0540a4dc-1b86-4776-b633-f540af168a2b] return evt.wait() [ 926.238029] env[68906]: ERROR nova.compute.manager [instance: 0540a4dc-1b86-4776-b633-f540af168a2b] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 926.238029] env[68906]: ERROR nova.compute.manager [instance: 0540a4dc-1b86-4776-b633-f540af168a2b] result = hub.switch() [ 926.238029] env[68906]: ERROR nova.compute.manager [instance: 0540a4dc-1b86-4776-b633-f540af168a2b] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 926.238029] env[68906]: ERROR nova.compute.manager [instance: 0540a4dc-1b86-4776-b633-f540af168a2b] return self.greenlet.switch() [ 926.238417] env[68906]: ERROR nova.compute.manager [instance: 0540a4dc-1b86-4776-b633-f540af168a2b] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 926.238417] env[68906]: ERROR nova.compute.manager [instance: 0540a4dc-1b86-4776-b633-f540af168a2b] self.f(*self.args, **self.kw) [ 926.238417] env[68906]: ERROR nova.compute.manager [instance: 0540a4dc-1b86-4776-b633-f540af168a2b] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 926.238417] env[68906]: ERROR nova.compute.manager [instance: 0540a4dc-1b86-4776-b633-f540af168a2b] raise exceptions.translate_fault(task_info.error) [ 926.238417] env[68906]: ERROR nova.compute.manager [instance: 0540a4dc-1b86-4776-b633-f540af168a2b] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 926.238417] env[68906]: ERROR nova.compute.manager [instance: 0540a4dc-1b86-4776-b633-f540af168a2b] Faults: ['InvalidArgument'] [ 926.238417] env[68906]: ERROR nova.compute.manager [instance: 0540a4dc-1b86-4776-b633-f540af168a2b] [ 926.238417] env[68906]: INFO nova.compute.manager [None req-96841733-5efb-421f-b645-bab4686e4f9f tempest-ServerActionsTestJSON-1120749817 tempest-ServerActionsTestJSON-1120749817-project-member] [instance: 0540a4dc-1b86-4776-b633-f540af168a2b] Terminating instance [ 926.238417] env[68906]: DEBUG oslo_concurrency.lockutils [None req-a8230d4c-efd4-4641-85fe-bd949bd747e5 tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Acquired lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e/b1400c31-d33b-4e13-944f-4c645e62493e.vmdk" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 926.238676] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-a8230d4c-efd4-4641-85fe-bd949bd747e5 tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=68906) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 926.238676] env[68906]: DEBUG nova.compute.manager [None req-96841733-5efb-421f-b645-bab4686e4f9f tempest-ServerActionsTestJSON-1120749817 tempest-ServerActionsTestJSON-1120749817-project-member] [instance: 
0540a4dc-1b86-4776-b633-f540af168a2b] Start destroying the instance on the hypervisor. {{(pid=68906) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 926.238676] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-96841733-5efb-421f-b645-bab4686e4f9f tempest-ServerActionsTestJSON-1120749817 tempest-ServerActionsTestJSON-1120749817-project-member] [instance: 0540a4dc-1b86-4776-b633-f540af168a2b] Destroying instance {{(pid=68906) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 926.238676] env[68906]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-edd943e3-67b4-4a57-b64d-05b4e06b66e6 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 926.239868] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0aeaa51d-5f76-4bad-8866-3134c6166d3a {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 926.246692] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-96841733-5efb-421f-b645-bab4686e4f9f tempest-ServerActionsTestJSON-1120749817 tempest-ServerActionsTestJSON-1120749817-project-member] [instance: 0540a4dc-1b86-4776-b633-f540af168a2b] Unregistering the VM {{(pid=68906) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 926.246943] env[68906]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-324673ab-9edc-436f-b9bb-078a630df521 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 926.249233] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-a8230d4c-efd4-4641-85fe-bd949bd747e5 tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=68906) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 926.249403] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-a8230d4c-efd4-4641-85fe-bd949bd747e5 tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=68906) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 926.250365] env[68906]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-76039940-53b5-4755-9749-09444656ea96 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 926.255210] env[68906]: DEBUG oslo_vmware.api [None req-a8230d4c-efd4-4641-85fe-bd949bd747e5 tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Waiting for the task: (returnval){ [ 926.255210] env[68906]: value = "session[52a3cb0d-1212-490c-2669-91043b4da4d8]52a53593-cb03-e7ba-51be-b889e5cc2820" [ 926.255210] env[68906]: _type = "Task" [ 926.255210] env[68906]: } to complete. {{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 926.262625] env[68906]: DEBUG oslo_vmware.api [None req-a8230d4c-efd4-4641-85fe-bd949bd747e5 tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Task: {'id': session[52a3cb0d-1212-490c-2669-91043b4da4d8]52a53593-cb03-e7ba-51be-b889e5cc2820, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 926.313193] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-96841733-5efb-421f-b645-bab4686e4f9f tempest-ServerActionsTestJSON-1120749817 tempest-ServerActionsTestJSON-1120749817-project-member] [instance: 0540a4dc-1b86-4776-b633-f540af168a2b] Unregistered the VM {{(pid=68906) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 926.313514] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-96841733-5efb-421f-b645-bab4686e4f9f tempest-ServerActionsTestJSON-1120749817 tempest-ServerActionsTestJSON-1120749817-project-member] [instance: 0540a4dc-1b86-4776-b633-f540af168a2b] Deleting contents of the VM from datastore datastore2 {{(pid=68906) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 926.313715] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-96841733-5efb-421f-b645-bab4686e4f9f tempest-ServerActionsTestJSON-1120749817 tempest-ServerActionsTestJSON-1120749817-project-member] Deleting the datastore file [datastore2] 0540a4dc-1b86-4776-b633-f540af168a2b {{(pid=68906) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 926.313997] env[68906]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-db416a1b-20d7-4de5-a41d-ec4ede98b902 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 926.319908] env[68906]: DEBUG oslo_vmware.api [None req-96841733-5efb-421f-b645-bab4686e4f9f tempest-ServerActionsTestJSON-1120749817 tempest-ServerActionsTestJSON-1120749817-project-member] Waiting for the task: (returnval){ [ 926.319908] env[68906]: value = "task-3475333" [ 926.319908] env[68906]: _type = "Task" [ 926.319908] env[68906]: } to complete. {{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 926.327564] env[68906]: DEBUG oslo_vmware.api [None req-96841733-5efb-421f-b645-bab4686e4f9f tempest-ServerActionsTestJSON-1120749817 tempest-ServerActionsTestJSON-1120749817-project-member] Task: {'id': task-3475333, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 926.765614] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-a8230d4c-efd4-4641-85fe-bd949bd747e5 tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] [instance: 4edb8b9f-b608-4be8-bfd3-65642710f9bd] Preparing fetch location {{(pid=68906) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 926.765876] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-a8230d4c-efd4-4641-85fe-bd949bd747e5 tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Creating directory with path [datastore2] vmware_temp/4c279939-0d5e-4c87-b814-e9a22431f9d2/b1400c31-d33b-4e13-944f-4c645e62493e {{(pid=68906) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 926.766122] env[68906]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-3b0c6456-4882-4b16-b80b-ec0fe572e4ce {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 926.777368] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-a8230d4c-efd4-4641-85fe-bd949bd747e5 tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Created directory with path [datastore2] vmware_temp/4c279939-0d5e-4c87-b814-e9a22431f9d2/b1400c31-d33b-4e13-944f-4c645e62493e {{(pid=68906) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 926.777673] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-a8230d4c-efd4-4641-85fe-bd949bd747e5 tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] [instance: 4edb8b9f-b608-4be8-bfd3-65642710f9bd] Fetch image to [datastore2] vmware_temp/4c279939-0d5e-4c87-b814-e9a22431f9d2/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk {{(pid=68906) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 926.777792] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-a8230d4c-efd4-4641-85fe-bd949bd747e5 tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] [instance: 4edb8b9f-b608-4be8-bfd3-65642710f9bd] Downloading image file data b1400c31-d33b-4e13-944f-4c645e62493e to [datastore2] vmware_temp/4c279939-0d5e-4c87-b814-e9a22431f9d2/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk on the data store datastore2 {{(pid=68906) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 926.778465] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9f781825-b20a-418b-833a-5830017f93fa {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 926.784719] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-44428f83-75bd-4830-92aa-01f443976a55 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 926.794454] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a73ff386-862f-401d-9240-ec29fe92a038 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 926.827744] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-42a056ef-585e-42b1-a81d-1214b1ff0869 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 926.835902] env[68906]: DEBUG oslo_vmware.api [None req-96841733-5efb-421f-b645-bab4686e4f9f tempest-ServerActionsTestJSON-1120749817 tempest-ServerActionsTestJSON-1120749817-project-member] Task: {'id': task-3475333, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.062216} completed successfully. {{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 926.836667] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-96841733-5efb-421f-b645-bab4686e4f9f tempest-ServerActionsTestJSON-1120749817 tempest-ServerActionsTestJSON-1120749817-project-member] Deleted the datastore file {{(pid=68906) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 926.836855] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-96841733-5efb-421f-b645-bab4686e4f9f tempest-ServerActionsTestJSON-1120749817 tempest-ServerActionsTestJSON-1120749817-project-member] [instance: 0540a4dc-1b86-4776-b633-f540af168a2b] Deleted contents of the VM from datastore datastore2 {{(pid=68906) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 926.837036] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-96841733-5efb-421f-b645-bab4686e4f9f tempest-ServerActionsTestJSON-1120749817 tempest-ServerActionsTestJSON-1120749817-project-member] [instance: 0540a4dc-1b86-4776-b633-f540af168a2b] Instance destroyed {{(pid=68906) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 926.837224] env[68906]: INFO nova.compute.manager [None req-96841733-5efb-421f-b645-bab4686e4f9f tempest-ServerActionsTestJSON-1120749817 tempest-ServerActionsTestJSON-1120749817-project-member] [instance: 0540a4dc-1b86-4776-b633-f540af168a2b] Took 0.60 seconds to destroy the instance on the hypervisor. 
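The CreateVM_Task, SearchDatastore_Task, CopyVirtualDisk_Task and DeleteDatastoreFile_Task entries above all follow the same oslo.vmware pattern: invoke a vSphere method that returns a Task reference, then poll it until it reports success or raises a translated fault (the "Waiting for the task ... to complete", "progress is 0%." and "completed successfully" lines). A minimal sketch of that polling pattern, with a hypothetical get_task_info() callable standing in for the PropertyCollector round trip oslo.vmware actually performs:

import time

class TaskFailed(Exception):
    """Stand-in for the translated fault raised by _poll_task."""

def wait_for_task(get_task_info, task_ref, poll_interval=0.5):
    # Simplified sketch, not the oslo.vmware implementation.
    # get_task_info(task_ref) is assumed to return a dict with
    # 'state' ('queued'/'running'/'success'/'error'), 'progress'
    # and, on failure, 'error'.
    while True:
        info = get_task_info(task_ref)
        if info['state'] == 'success':
            return info.get('result')
        if info['state'] == 'error':
            # Mirrors "raise exceptions.translate_fault(task_info.error)"
            # in the tracebacks above.
            raise TaskFailed(info['error'])
        print("Task: %s progress is %s%%."
              % (task_ref, info.get('progress', 0)))
        time.sleep(poll_interval)

The CreateVM_Task that completed in 0.319504s above is the happy path of this loop; the CopyVirtualDisk_Task that raised InvalidArgument is the error branch.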
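The earlier WARNING from oslo_vmware.rw_handles ("Error occurred while reading the HTTP response.: http.client.RemoteDisconnected") comes from the write handle's close() trying to read the datastore's HTTP response after streaming the image, while the later "Completed reading data from the image iterator. / Closing write handle" pair shows the same upload path succeeding. A hedged sketch of tolerating that condition when closing an upload connection (an illustration of the pattern, not the oslo.vmware code):

import http.client

def close_write_handle(conn):
    # conn is assumed to be an http.client.HTTPSConnection that has
    # already had the full file body written to it.
    try:
        resp = conn.getresponse()
        resp.read()  # drain the response before closing
    except http.client.RemoteDisconnected:
        # The server closed the socket without a response; as in the
        # log above this is logged and otherwise tolerated, since the
        # upload data has already been sent.
        print("Remote end closed connection without response")
    finally:
        conn.close()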
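The "Received event network-vif-plugged-... / No waiting events found dispatching ... / Received unexpected event ... for instance with vm_state building" sequence earlier in the log is Nova's external-event rendezvous: Neutron reports the VIF plug and the compute manager pops a per-instance waiter if one was registered, otherwise it logs the event as unexpected. A toy model of that pop-or-warn dispatch (Nova's real InstanceEvents class is more involved; the names here only mirror the observable behavior):

import threading

class InstanceEvents:
    def __init__(self):
        self._lock = threading.Lock()
        # (instance_uuid, event_name) -> threading.Event
        self._waiters = {}

    def prepare_for_event(self, instance_uuid, event_name):
        ev = threading.Event()
        with self._lock:
            self._waiters[(instance_uuid, event_name)] = ev
        return ev

    def pop_instance_event(self, instance_uuid, event_name):
        with self._lock:
            return self._waiters.pop((instance_uuid, event_name), None)

def external_instance_event(events, instance_uuid, event_name):
    waiter = events.pop_instance_event(instance_uuid, event_name)
    if waiter is None:
        # Matches the WARNING above: the event arrived before any
        # spawn thread started waiting for it, which is harmless
        # while the instance is still building.
        print("Received unexpected event %s for instance %s"
              % (event_name, instance_uuid))
    else:
        waiter.set()  # wake the thread blocked on the plug event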
[ 926.838967] env[68906]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-73111706-f3fc-4137-821a-86d8e7ff87b8 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 926.840910] env[68906]: DEBUG nova.compute.claims [None req-96841733-5efb-421f-b645-bab4686e4f9f tempest-ServerActionsTestJSON-1120749817 tempest-ServerActionsTestJSON-1120749817-project-member] [instance: 0540a4dc-1b86-4776-b633-f540af168a2b] Aborting claim: {{(pid=68906) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 926.841008] env[68906]: DEBUG oslo_concurrency.lockutils [None req-96841733-5efb-421f-b645-bab4686e4f9f tempest-ServerActionsTestJSON-1120749817 tempest-ServerActionsTestJSON-1120749817-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 926.841230] env[68906]: DEBUG oslo_concurrency.lockutils [None req-96841733-5efb-421f-b645-bab4686e4f9f tempest-ServerActionsTestJSON-1120749817 tempest-ServerActionsTestJSON-1120749817-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 926.864229] env[68906]: DEBUG nova.virt.vmwareapi.images [None req-a8230d4c-efd4-4641-85fe-bd949bd747e5 tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] [instance: 4edb8b9f-b608-4be8-bfd3-65642710f9bd] Downloading image file data b1400c31-d33b-4e13-944f-4c645e62493e to the data store datastore2 {{(pid=68906) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 926.983929] env[68906]: DEBUG oslo_vmware.rw_handles [None req-a8230d4c-efd4-4641-85fe-bd949bd747e5 tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/4c279939-0d5e-4c87-b814-e9a22431f9d2/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=68906) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 927.047816] env[68906]: DEBUG oslo_vmware.rw_handles [None req-a8230d4c-efd4-4641-85fe-bd949bd747e5 tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Completed reading data from the image iterator. {{(pid=68906) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 927.047816] env[68906]: DEBUG oslo_vmware.rw_handles [None req-a8230d4c-efd4-4641-85fe-bd949bd747e5 tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Closing write handle for https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/4c279939-0d5e-4c87-b814-e9a22431f9d2/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=68906) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 927.260587] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cb685325-54aa-4da7-b5c0-4b28e8c47ec0 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 927.268504] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a2d8bef2-f3af-4623-b21e-efb2162258b2 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 927.299536] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c748d08c-d0e2-495d-8ff4-5d4ba54ce96a {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 927.307258] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-45f8f9a1-2f0c-4376-ae0d-b74520cdc304 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 927.320682] env[68906]: DEBUG nova.compute.provider_tree [None req-96841733-5efb-421f-b645-bab4686e4f9f tempest-ServerActionsTestJSON-1120749817 tempest-ServerActionsTestJSON-1120749817-project-member] Inventory has not changed in ProviderTree for provider: 1119f6db-bfd7-4ef3-bdff-5c6974dc249b {{(pid=68906) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 927.329864] env[68906]: DEBUG nova.scheduler.client.report [None req-96841733-5efb-421f-b645-bab4686e4f9f tempest-ServerActionsTestJSON-1120749817 tempest-ServerActionsTestJSON-1120749817-project-member] Inventory has not changed for provider 1119f6db-bfd7-4ef3-bdff-5c6974dc249b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 93, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68906) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 927.344772] env[68906]: DEBUG oslo_concurrency.lockutils [None req-96841733-5efb-421f-b645-bab4686e4f9f tempest-ServerActionsTestJSON-1120749817 tempest-ServerActionsTestJSON-1120749817-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.503s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 927.345317] env[68906]: ERROR nova.compute.manager [None req-96841733-5efb-421f-b645-bab4686e4f9f tempest-ServerActionsTestJSON-1120749817 tempest-ServerActionsTestJSON-1120749817-project-member] [instance: 0540a4dc-1b86-4776-b633-f540af168a2b] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 927.345317] env[68906]: Faults: ['InvalidArgument'] [ 927.345317] env[68906]: ERROR nova.compute.manager [instance: 0540a4dc-1b86-4776-b633-f540af168a2b] Traceback (most recent call last): [ 927.345317] env[68906]: ERROR nova.compute.manager [instance: 0540a4dc-1b86-4776-b633-f540af168a2b] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 927.345317] env[68906]: ERROR 
nova.compute.manager [instance: 0540a4dc-1b86-4776-b633-f540af168a2b] self.driver.spawn(context, instance, image_meta, [ 927.345317] env[68906]: ERROR nova.compute.manager [instance: 0540a4dc-1b86-4776-b633-f540af168a2b] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 927.345317] env[68906]: ERROR nova.compute.manager [instance: 0540a4dc-1b86-4776-b633-f540af168a2b] self._vmops.spawn(context, instance, image_meta, injected_files, [ 927.345317] env[68906]: ERROR nova.compute.manager [instance: 0540a4dc-1b86-4776-b633-f540af168a2b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 927.345317] env[68906]: ERROR nova.compute.manager [instance: 0540a4dc-1b86-4776-b633-f540af168a2b] self._fetch_image_if_missing(context, vi) [ 927.345317] env[68906]: ERROR nova.compute.manager [instance: 0540a4dc-1b86-4776-b633-f540af168a2b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 927.345317] env[68906]: ERROR nova.compute.manager [instance: 0540a4dc-1b86-4776-b633-f540af168a2b] image_cache(vi, tmp_image_ds_loc) [ 927.345317] env[68906]: ERROR nova.compute.manager [instance: 0540a4dc-1b86-4776-b633-f540af168a2b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 927.345766] env[68906]: ERROR nova.compute.manager [instance: 0540a4dc-1b86-4776-b633-f540af168a2b] vm_util.copy_virtual_disk( [ 927.345766] env[68906]: ERROR nova.compute.manager [instance: 0540a4dc-1b86-4776-b633-f540af168a2b] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 927.345766] env[68906]: ERROR nova.compute.manager [instance: 0540a4dc-1b86-4776-b633-f540af168a2b] session._wait_for_task(vmdk_copy_task) [ 927.345766] env[68906]: ERROR nova.compute.manager [instance: 0540a4dc-1b86-4776-b633-f540af168a2b] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 927.345766] env[68906]: ERROR nova.compute.manager [instance: 0540a4dc-1b86-4776-b633-f540af168a2b] return self.wait_for_task(task_ref) [ 927.345766] env[68906]: ERROR nova.compute.manager [instance: 0540a4dc-1b86-4776-b633-f540af168a2b] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 927.345766] env[68906]: ERROR nova.compute.manager [instance: 0540a4dc-1b86-4776-b633-f540af168a2b] return evt.wait() [ 927.345766] env[68906]: ERROR nova.compute.manager [instance: 0540a4dc-1b86-4776-b633-f540af168a2b] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 927.345766] env[68906]: ERROR nova.compute.manager [instance: 0540a4dc-1b86-4776-b633-f540af168a2b] result = hub.switch() [ 927.345766] env[68906]: ERROR nova.compute.manager [instance: 0540a4dc-1b86-4776-b633-f540af168a2b] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 927.345766] env[68906]: ERROR nova.compute.manager [instance: 0540a4dc-1b86-4776-b633-f540af168a2b] return self.greenlet.switch() [ 927.345766] env[68906]: ERROR nova.compute.manager [instance: 0540a4dc-1b86-4776-b633-f540af168a2b] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 927.345766] env[68906]: ERROR nova.compute.manager [instance: 0540a4dc-1b86-4776-b633-f540af168a2b] self.f(*self.args, **self.kw) [ 927.346248] env[68906]: ERROR nova.compute.manager [instance: 0540a4dc-1b86-4776-b633-f540af168a2b] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 927.346248] env[68906]: ERROR nova.compute.manager [instance: 0540a4dc-1b86-4776-b633-f540af168a2b] raise exceptions.translate_fault(task_info.error) [ 927.346248] env[68906]: ERROR nova.compute.manager [instance: 0540a4dc-1b86-4776-b633-f540af168a2b] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 927.346248] env[68906]: ERROR nova.compute.manager [instance: 0540a4dc-1b86-4776-b633-f540af168a2b] Faults: ['InvalidArgument'] [ 927.346248] env[68906]: ERROR nova.compute.manager [instance: 0540a4dc-1b86-4776-b633-f540af168a2b] [ 927.346248] env[68906]: DEBUG nova.compute.utils [None req-96841733-5efb-421f-b645-bab4686e4f9f tempest-ServerActionsTestJSON-1120749817 tempest-ServerActionsTestJSON-1120749817-project-member] [instance: 0540a4dc-1b86-4776-b633-f540af168a2b] VimFaultException {{(pid=68906) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 927.347421] env[68906]: DEBUG nova.compute.manager [None req-96841733-5efb-421f-b645-bab4686e4f9f tempest-ServerActionsTestJSON-1120749817 tempest-ServerActionsTestJSON-1120749817-project-member] [instance: 0540a4dc-1b86-4776-b633-f540af168a2b] Build of instance 0540a4dc-1b86-4776-b633-f540af168a2b was re-scheduled: A specified parameter was not correct: fileType [ 927.347421] env[68906]: Faults: ['InvalidArgument'] {{(pid=68906) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 927.347792] env[68906]: DEBUG nova.compute.manager [None req-96841733-5efb-421f-b645-bab4686e4f9f tempest-ServerActionsTestJSON-1120749817 tempest-ServerActionsTestJSON-1120749817-project-member] [instance: 0540a4dc-1b86-4776-b633-f540af168a2b] Unplugging VIFs for instance {{(pid=68906) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 927.347961] env[68906]: DEBUG nova.compute.manager [None req-96841733-5efb-421f-b645-bab4686e4f9f tempest-ServerActionsTestJSON-1120749817 tempest-ServerActionsTestJSON-1120749817-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=68906) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 927.348142] env[68906]: DEBUG nova.compute.manager [None req-96841733-5efb-421f-b645-bab4686e4f9f tempest-ServerActionsTestJSON-1120749817 tempest-ServerActionsTestJSON-1120749817-project-member] [instance: 0540a4dc-1b86-4776-b633-f540af168a2b] Deallocating network for instance {{(pid=68906) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 927.348305] env[68906]: DEBUG nova.network.neutron [None req-96841733-5efb-421f-b645-bab4686e4f9f tempest-ServerActionsTestJSON-1120749817 tempest-ServerActionsTestJSON-1120749817-project-member] [instance: 0540a4dc-1b86-4776-b633-f540af168a2b] deallocate_for_instance() {{(pid=68906) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 927.850363] env[68906]: DEBUG nova.network.neutron [None req-96841733-5efb-421f-b645-bab4686e4f9f tempest-ServerActionsTestJSON-1120749817 tempest-ServerActionsTestJSON-1120749817-project-member] [instance: 0540a4dc-1b86-4776-b633-f540af168a2b] Updating instance_info_cache with network_info: [] {{(pid=68906) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 927.866543] env[68906]: INFO nova.compute.manager [None req-96841733-5efb-421f-b645-bab4686e4f9f tempest-ServerActionsTestJSON-1120749817 tempest-ServerActionsTestJSON-1120749817-project-member] [instance: 0540a4dc-1b86-4776-b633-f540af168a2b] Took 0.52 seconds to deallocate network for instance. [ 927.975552] env[68906]: INFO nova.scheduler.client.report [None req-96841733-5efb-421f-b645-bab4686e4f9f tempest-ServerActionsTestJSON-1120749817 tempest-ServerActionsTestJSON-1120749817-project-member] Deleted allocations for instance 0540a4dc-1b86-4776-b633-f540af168a2b [ 927.995875] env[68906]: DEBUG oslo_concurrency.lockutils [None req-96841733-5efb-421f-b645-bab4686e4f9f tempest-ServerActionsTestJSON-1120749817 tempest-ServerActionsTestJSON-1120749817-project-member] Lock "0540a4dc-1b86-4776-b633-f540af168a2b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 290.465s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 927.998919] env[68906]: DEBUG oslo_concurrency.lockutils [None req-cb0f3ac9-ddb7-4977-ac45-2a85841a42ce tempest-ServerActionsTestJSON-1120749817 tempest-ServerActionsTestJSON-1120749817-project-member] Lock "0540a4dc-1b86-4776-b633-f540af168a2b" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 88.685s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 927.998919] env[68906]: DEBUG oslo_concurrency.lockutils [None req-cb0f3ac9-ddb7-4977-ac45-2a85841a42ce tempest-ServerActionsTestJSON-1120749817 tempest-ServerActionsTestJSON-1120749817-project-member] Acquiring lock "0540a4dc-1b86-4776-b633-f540af168a2b-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 927.998919] env[68906]: DEBUG oslo_concurrency.lockutils [None req-cb0f3ac9-ddb7-4977-ac45-2a85841a42ce tempest-ServerActionsTestJSON-1120749817 tempest-ServerActionsTestJSON-1120749817-project-member] Lock "0540a4dc-1b86-4776-b633-f540af168a2b-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=68906) 
inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 927.999158] env[68906]: DEBUG oslo_concurrency.lockutils [None req-cb0f3ac9-ddb7-4977-ac45-2a85841a42ce tempest-ServerActionsTestJSON-1120749817 tempest-ServerActionsTestJSON-1120749817-project-member] Lock "0540a4dc-1b86-4776-b633-f540af168a2b-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 928.001665] env[68906]: INFO nova.compute.manager [None req-cb0f3ac9-ddb7-4977-ac45-2a85841a42ce tempest-ServerActionsTestJSON-1120749817 tempest-ServerActionsTestJSON-1120749817-project-member] [instance: 0540a4dc-1b86-4776-b633-f540af168a2b] Terminating instance [ 928.005035] env[68906]: DEBUG nova.compute.manager [None req-cb0f3ac9-ddb7-4977-ac45-2a85841a42ce tempest-ServerActionsTestJSON-1120749817 tempest-ServerActionsTestJSON-1120749817-project-member] [instance: 0540a4dc-1b86-4776-b633-f540af168a2b] Start destroying the instance on the hypervisor. {{(pid=68906) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 928.005035] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-cb0f3ac9-ddb7-4977-ac45-2a85841a42ce tempest-ServerActionsTestJSON-1120749817 tempest-ServerActionsTestJSON-1120749817-project-member] [instance: 0540a4dc-1b86-4776-b633-f540af168a2b] Destroying instance {{(pid=68906) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 928.005422] env[68906]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-f592a8db-c6e7-454a-8758-aa44d2b9fb9a {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 928.011227] env[68906]: DEBUG nova.compute.manager [None req-dc466130-d812-4b39-b2cd-70750d4485d7 tempest-DeleteServersAdminTestJSON-1668331877 tempest-DeleteServersAdminTestJSON-1668331877-project-member] [instance: 653c016d-c596-4f45-a18e-55f2d1935166] Starting instance... {{(pid=68906) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 928.022019] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f25a94c6-68f6-47ad-b33f-3eea0463ff13 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 928.036622] env[68906]: DEBUG nova.compute.manager [None req-dc466130-d812-4b39-b2cd-70750d4485d7 tempest-DeleteServersAdminTestJSON-1668331877 tempest-DeleteServersAdminTestJSON-1668331877-project-member] [instance: 653c016d-c596-4f45-a18e-55f2d1935166] Instance disappeared before build. {{(pid=68906) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 928.052134] env[68906]: WARNING nova.virt.vmwareapi.vmops [None req-cb0f3ac9-ddb7-4977-ac45-2a85841a42ce tempest-ServerActionsTestJSON-1120749817 tempest-ServerActionsTestJSON-1120749817-project-member] [instance: 0540a4dc-1b86-4776-b633-f540af168a2b] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 0540a4dc-1b86-4776-b633-f540af168a2b could not be found. 
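Both build failures above end in oslo_vmware.exceptions.VimFaultException with Faults: ['InvalidArgument'] (the "fileType" complaint from CopyVirtualDisk_Task), and the "Fault InvalidArgument not matched" DEBUG line shows oslo.vmware finding no more specific exception class for that fault name before falling back to the generic one. Callers can inspect fault_list, which oslo.vmware does expose on VimFaultException; the classification policy below is illustrative, not Nova's code:

from oslo_vmware import exceptions as vexc

def classify_fault(exc):
    # Illustrative: decide whether a VimFaultException is worth
    # retrying. fault_list is a real attribute of the exception;
    # the policy itself is an assumption for this sketch.
    if isinstance(exc, vexc.VimFaultException):
        if 'InvalidArgument' in (exc.fault_list or []):
            # "A specified parameter was not correct: fileType":
            # the request itself is malformed, so retrying the same
            # call cannot succeed. Nova aborts the resource claim
            # and re-schedules the build instead, as logged above.
            return 'fatal'
        return 'maybe-retry'
    return 'unknown'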
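Nearly every oslo_concurrency.lockutils DEBUG line in this log is an acquire/release pair that also records how long the caller waited and how long the lock was held: the "held 290.465s" release just above shows the instance-UUID lock pinned for the entire failed build, after which do_terminate_instance reports having "waited 88.685s" for that same lock. The pattern behind those messages is a named lock guarding a critical section; a minimal sketch using the real oslo.concurrency API (the UUIDs are taken from the log, the function body is illustrative):

from oslo_concurrency import lockutils

@lockutils.synchronized('0540a4dc-1b86-4776-b633-f540af168a2b')
def do_terminate_instance():
    # Serialized against any other code path (such as
    # _locked_do_build_and_run_instance) that synchronizes on the
    # same instance UUID; this is why the terminate above could only
    # proceed once the failed build released the lock.
    pass

# Context-manager form, as used for the image-cache locks:
with lockutils.lock('[datastore2] devstack-image-cache_base/'
                    'b1400c31-d33b-4e13-944f-4c645e62493e'):
    pass  # fetch or reuse the cached image without racing other builds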
[ 928.052339] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-cb0f3ac9-ddb7-4977-ac45-2a85841a42ce tempest-ServerActionsTestJSON-1120749817 tempest-ServerActionsTestJSON-1120749817-project-member] [instance: 0540a4dc-1b86-4776-b633-f540af168a2b] Instance destroyed {{(pid=68906) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 928.052514] env[68906]: INFO nova.compute.manager [None req-cb0f3ac9-ddb7-4977-ac45-2a85841a42ce tempest-ServerActionsTestJSON-1120749817 tempest-ServerActionsTestJSON-1120749817-project-member] [instance: 0540a4dc-1b86-4776-b633-f540af168a2b] Took 0.05 seconds to destroy the instance on the hypervisor. [ 928.052749] env[68906]: DEBUG oslo.service.loopingcall [None req-cb0f3ac9-ddb7-4977-ac45-2a85841a42ce tempest-ServerActionsTestJSON-1120749817 tempest-ServerActionsTestJSON-1120749817-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=68906) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 928.052972] env[68906]: DEBUG nova.compute.manager [-] [instance: 0540a4dc-1b86-4776-b633-f540af168a2b] Deallocating network for instance {{(pid=68906) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 928.053085] env[68906]: DEBUG nova.network.neutron [-] [instance: 0540a4dc-1b86-4776-b633-f540af168a2b] deallocate_for_instance() {{(pid=68906) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 928.068423] env[68906]: DEBUG oslo_concurrency.lockutils [None req-dc466130-d812-4b39-b2cd-70750d4485d7 tempest-DeleteServersAdminTestJSON-1668331877 tempest-DeleteServersAdminTestJSON-1668331877-project-member] Lock "653c016d-c596-4f45-a18e-55f2d1935166" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 239.249s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 928.080629] env[68906]: DEBUG nova.compute.manager [None req-2c716e36-1965-4ec1-9f11-d2bddcd0b495 tempest-ServersTestJSON-364002111 tempest-ServersTestJSON-364002111-project-member] [instance: 627c0227-72ca-4a77-aca1-bc3112955e7a] Starting instance... {{(pid=68906) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 928.108232] env[68906]: DEBUG nova.compute.manager [None req-2c716e36-1965-4ec1-9f11-d2bddcd0b495 tempest-ServersTestJSON-364002111 tempest-ServersTestJSON-364002111-project-member] [instance: 627c0227-72ca-4a77-aca1-bc3112955e7a] Instance disappeared before build. 
{{(pid=68906) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 928.131577] env[68906]: DEBUG oslo_concurrency.lockutils [None req-2c716e36-1965-4ec1-9f11-d2bddcd0b495 tempest-ServersTestJSON-364002111 tempest-ServersTestJSON-364002111-project-member] Lock "627c0227-72ca-4a77-aca1-bc3112955e7a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 237.467s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 928.151086] env[68906]: DEBUG nova.network.neutron [-] [instance: 0540a4dc-1b86-4776-b633-f540af168a2b] Updating instance_info_cache with network_info: [] {{(pid=68906) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 928.154616] env[68906]: DEBUG nova.compute.manager [None req-639c9c99-7440-4229-8f59-1abf591f4d11 tempest-ServersAdmin275Test-949005569 tempest-ServersAdmin275Test-949005569-project-member] [instance: 03e8dff3-b6b8-4754-8725-dddc9f9e6216] Starting instance... {{(pid=68906) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 928.170885] env[68906]: INFO nova.compute.manager [-] [instance: 0540a4dc-1b86-4776-b633-f540af168a2b] Took 0.12 seconds to deallocate network for instance. [ 928.179130] env[68906]: DEBUG nova.compute.manager [None req-639c9c99-7440-4229-8f59-1abf591f4d11 tempest-ServersAdmin275Test-949005569 tempest-ServersAdmin275Test-949005569-project-member] [instance: 03e8dff3-b6b8-4754-8725-dddc9f9e6216] Instance disappeared before build. {{(pid=68906) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 928.202277] env[68906]: DEBUG oslo_concurrency.lockutils [None req-639c9c99-7440-4229-8f59-1abf591f4d11 tempest-ServersAdmin275Test-949005569 tempest-ServersAdmin275Test-949005569-project-member] Lock "03e8dff3-b6b8-4754-8725-dddc9f9e6216" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 234.645s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 928.213905] env[68906]: DEBUG nova.compute.manager [None req-dd87d48f-02f7-4c99-a534-1093df4a8f74 tempest-InstanceActionsNegativeTestJSON-1585889666 tempest-InstanceActionsNegativeTestJSON-1585889666-project-member] [instance: d3c5fdf4-a775-4b88-9bc2-ce9f31a9e6ac] Starting instance... {{(pid=68906) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 928.238311] env[68906]: DEBUG nova.compute.manager [None req-dd87d48f-02f7-4c99-a534-1093df4a8f74 tempest-InstanceActionsNegativeTestJSON-1585889666 tempest-InstanceActionsNegativeTestJSON-1585889666-project-member] [instance: d3c5fdf4-a775-4b88-9bc2-ce9f31a9e6ac] Instance disappeared before build. 
{{(pid=68906) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 928.268836] env[68906]: DEBUG oslo_concurrency.lockutils [None req-dd87d48f-02f7-4c99-a534-1093df4a8f74 tempest-InstanceActionsNegativeTestJSON-1585889666 tempest-InstanceActionsNegativeTestJSON-1585889666-project-member] Lock "d3c5fdf4-a775-4b88-9bc2-ce9f31a9e6ac" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 230.780s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 928.285703] env[68906]: DEBUG oslo_concurrency.lockutils [None req-cb0f3ac9-ddb7-4977-ac45-2a85841a42ce tempest-ServerActionsTestJSON-1120749817 tempest-ServerActionsTestJSON-1120749817-project-member] Lock "0540a4dc-1b86-4776-b633-f540af168a2b" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.288s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 928.291430] env[68906]: DEBUG nova.compute.manager [None req-41575b90-eab8-4bb8-a519-30f8f4618f78 tempest-ImagesOneServerNegativeTestJSON-982035189 tempest-ImagesOneServerNegativeTestJSON-982035189-project-member] [instance: 242433e2-5b59-4b19-ba8d-80432ee4b7b7] Starting instance... {{(pid=68906) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 928.316746] env[68906]: DEBUG nova.compute.manager [None req-41575b90-eab8-4bb8-a519-30f8f4618f78 tempest-ImagesOneServerNegativeTestJSON-982035189 tempest-ImagesOneServerNegativeTestJSON-982035189-project-member] [instance: 242433e2-5b59-4b19-ba8d-80432ee4b7b7] Instance disappeared before build. {{(pid=68906) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 928.337847] env[68906]: DEBUG oslo_concurrency.lockutils [None req-41575b90-eab8-4bb8-a519-30f8f4618f78 tempest-ImagesOneServerNegativeTestJSON-982035189 tempest-ImagesOneServerNegativeTestJSON-982035189-project-member] Lock "242433e2-5b59-4b19-ba8d-80432ee4b7b7" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 227.102s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 928.348605] env[68906]: DEBUG nova.compute.manager [None req-86e3143e-5477-482e-8022-d48225e10873 tempest-ListImageFiltersTestJSON-730025420 tempest-ListImageFiltersTestJSON-730025420-project-member] [instance: acc11633-a489-4d8f-ad76-f17049a91545] Starting instance... 
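
The "Instance disappeared before build" records above all follow the same shape: each build serializes on a per-UUID lock, and once the lock is finally acquired (hundreds of seconds later here) the manager discovers the instance was deleted while queued and bails out. A compact sketch of that guard, using plain threading locks in place of lockutils:

```python
import threading
from contextlib import contextmanager

_instance_locks: dict[str, threading.Lock] = {}
_registry_guard = threading.Lock()

@contextmanager
def locked_instance(uuid: str):
    # One lock per instance UUID, so concurrent operations on the same
    # instance serialize, as the lockutils records above show.
    with _registry_guard:
        lock = _instance_locks.setdefault(uuid, threading.Lock())
    with lock:
        yield

def locked_do_build_and_run_instance(uuid, instance_exists):
    with locked_instance(uuid):
        if not instance_exists(uuid):
            # The user deleted the instance while the build sat in queue.
            print(f"instance {uuid}: Instance disappeared before build.")
            return
        # ... claim resources, allocate networks, spawn ...
```
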
{{(pid=68906) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 928.399532] env[68906]: DEBUG oslo_concurrency.lockutils [None req-86e3143e-5477-482e-8022-d48225e10873 tempest-ListImageFiltersTestJSON-730025420 tempest-ListImageFiltersTestJSON-730025420-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 928.399791] env[68906]: DEBUG oslo_concurrency.lockutils [None req-86e3143e-5477-482e-8022-d48225e10873 tempest-ListImageFiltersTestJSON-730025420 tempest-ListImageFiltersTestJSON-730025420-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 928.401282] env[68906]: INFO nova.compute.claims [None req-86e3143e-5477-482e-8022-d48225e10873 tempest-ListImageFiltersTestJSON-730025420 tempest-ListImageFiltersTestJSON-730025420-project-member] [instance: acc11633-a489-4d8f-ad76-f17049a91545] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 928.749223] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-631334fe-da01-4c0f-8c8e-cfbb050f4bf6 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 928.757077] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b78aaebd-d677-4427-a260-a303b80579c5 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 928.788596] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-321c4437-c17c-46b8-bb10-047da0b0e792 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 928.796149] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5d8b0c78-8a12-42cc-92a7-24b780c9d9e1 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 928.809605] env[68906]: DEBUG nova.compute.provider_tree [None req-86e3143e-5477-482e-8022-d48225e10873 tempest-ListImageFiltersTestJSON-730025420 tempest-ListImageFiltersTestJSON-730025420-project-member] Inventory has not changed in ProviderTree for provider: 1119f6db-bfd7-4ef3-bdff-5c6974dc249b {{(pid=68906) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 928.819012] env[68906]: DEBUG nova.scheduler.client.report [None req-86e3143e-5477-482e-8022-d48225e10873 tempest-ListImageFiltersTestJSON-730025420 tempest-ListImageFiltersTestJSON-730025420-project-member] Inventory has not changed for provider 1119f6db-bfd7-4ef3-bdff-5c6974dc249b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 93, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68906) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 928.834688] env[68906]: DEBUG oslo_concurrency.lockutils [None 
req-86e3143e-5477-482e-8022-d48225e10873 tempest-ListImageFiltersTestJSON-730025420 tempest-ListImageFiltersTestJSON-730025420-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.435s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 928.835289] env[68906]: DEBUG nova.compute.manager [None req-86e3143e-5477-482e-8022-d48225e10873 tempest-ListImageFiltersTestJSON-730025420 tempest-ListImageFiltersTestJSON-730025420-project-member] [instance: acc11633-a489-4d8f-ad76-f17049a91545] Start building networks asynchronously for instance. {{(pid=68906) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 928.869018] env[68906]: DEBUG nova.compute.utils [None req-86e3143e-5477-482e-8022-d48225e10873 tempest-ListImageFiltersTestJSON-730025420 tempest-ListImageFiltersTestJSON-730025420-project-member] Using /dev/sd instead of None {{(pid=68906) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 928.871827] env[68906]: DEBUG nova.compute.manager [None req-86e3143e-5477-482e-8022-d48225e10873 tempest-ListImageFiltersTestJSON-730025420 tempest-ListImageFiltersTestJSON-730025420-project-member] [instance: acc11633-a489-4d8f-ad76-f17049a91545] Allocating IP information in the background. {{(pid=68906) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 928.872011] env[68906]: DEBUG nova.network.neutron [None req-86e3143e-5477-482e-8022-d48225e10873 tempest-ListImageFiltersTestJSON-730025420 tempest-ListImageFiltersTestJSON-730025420-project-member] [instance: acc11633-a489-4d8f-ad76-f17049a91545] allocate_for_instance() {{(pid=68906) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 928.887237] env[68906]: DEBUG nova.compute.manager [None req-86e3143e-5477-482e-8022-d48225e10873 tempest-ListImageFiltersTestJSON-730025420 tempest-ListImageFiltersTestJSON-730025420-project-member] [instance: acc11633-a489-4d8f-ad76-f17049a91545] Start building block device mappings for instance. {{(pid=68906) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 928.954752] env[68906]: DEBUG nova.compute.manager [None req-86e3143e-5477-482e-8022-d48225e10873 tempest-ListImageFiltersTestJSON-730025420 tempest-ListImageFiltersTestJSON-730025420-project-member] [instance: acc11633-a489-4d8f-ad76-f17049a91545] Start spawning the instance on the hypervisor. 
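
Note the ordering above: the claim succeeds, then networks are built asynchronously ("Allocating IP information in the background") while block device mappings are prepared, and only then does the spawn start. The real service runs the allocation in an eventlet greenthread; here is a sketch of the same overlap using a thread pool, with stand-in functions:

```python
from concurrent.futures import ThreadPoolExecutor

def allocate_for_instance(instance):
    # Stand-in for the background Neutron port allocation.
    return [{"port_id": "..."}]

def build_block_device_mappings(instance):
    # Stand-in for the disk prep that proceeds while allocation is pending.
    return []

executor = ThreadPoolExecutor(max_workers=4)

def build_resources(instance):
    # Overlap network allocation with disk prep, then join before spawn;
    # waiting on the future here surfaces any allocation error.
    fut = executor.submit(allocate_for_instance, instance)
    bdms = build_block_device_mappings(instance)
    return fut.result(), bdms
```
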
{{(pid=68906) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 928.983088] env[68906]: DEBUG nova.virt.hardware [None req-86e3143e-5477-482e-8022-d48225e10873 tempest-ListImageFiltersTestJSON-730025420 tempest-ListImageFiltersTestJSON-730025420-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-17T13:00:39Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-17T13:00:23Z,direct_url=,disk_format='vmdk',id=b1400c31-d33b-4e13-944f-4c645e62493e,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='1ae7bf3a375d41c6af5e7536af51ffd1',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-17T13:00:24Z,virtual_size=,visibility=), allow threads: False {{(pid=68906) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 928.983212] env[68906]: DEBUG nova.virt.hardware [None req-86e3143e-5477-482e-8022-d48225e10873 tempest-ListImageFiltersTestJSON-730025420 tempest-ListImageFiltersTestJSON-730025420-project-member] Flavor limits 0:0:0 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 928.983358] env[68906]: DEBUG nova.virt.hardware [None req-86e3143e-5477-482e-8022-d48225e10873 tempest-ListImageFiltersTestJSON-730025420 tempest-ListImageFiltersTestJSON-730025420-project-member] Image limits 0:0:0 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 928.983550] env[68906]: DEBUG nova.virt.hardware [None req-86e3143e-5477-482e-8022-d48225e10873 tempest-ListImageFiltersTestJSON-730025420 tempest-ListImageFiltersTestJSON-730025420-project-member] Flavor pref 0:0:0 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 928.983697] env[68906]: DEBUG nova.virt.hardware [None req-86e3143e-5477-482e-8022-d48225e10873 tempest-ListImageFiltersTestJSON-730025420 tempest-ListImageFiltersTestJSON-730025420-project-member] Image pref 0:0:0 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 928.983843] env[68906]: DEBUG nova.virt.hardware [None req-86e3143e-5477-482e-8022-d48225e10873 tempest-ListImageFiltersTestJSON-730025420 tempest-ListImageFiltersTestJSON-730025420-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 928.984063] env[68906]: DEBUG nova.virt.hardware [None req-86e3143e-5477-482e-8022-d48225e10873 tempest-ListImageFiltersTestJSON-730025420 tempest-ListImageFiltersTestJSON-730025420-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68906) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 928.984227] env[68906]: DEBUG nova.virt.hardware [None req-86e3143e-5477-482e-8022-d48225e10873 tempest-ListImageFiltersTestJSON-730025420 tempest-ListImageFiltersTestJSON-730025420-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68906) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 928.984395] env[68906]: DEBUG 
nova.virt.hardware [None req-86e3143e-5477-482e-8022-d48225e10873 tempest-ListImageFiltersTestJSON-730025420 tempest-ListImageFiltersTestJSON-730025420-project-member] Got 1 possible topologies {{(pid=68906) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 928.984557] env[68906]: DEBUG nova.virt.hardware [None req-86e3143e-5477-482e-8022-d48225e10873 tempest-ListImageFiltersTestJSON-730025420 tempest-ListImageFiltersTestJSON-730025420-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68906) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 928.984742] env[68906]: DEBUG nova.virt.hardware [None req-86e3143e-5477-482e-8022-d48225e10873 tempest-ListImageFiltersTestJSON-730025420 tempest-ListImageFiltersTestJSON-730025420-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68906) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 928.985632] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4ccd599a-838b-4c4d-8587-e7d03539e650 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 928.994255] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-09e58680-777e-4c0c-8e54-bf74b8242e0a {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 929.008498] env[68906]: DEBUG nova.policy [None req-86e3143e-5477-482e-8022-d48225e10873 tempest-ListImageFiltersTestJSON-730025420 tempest-ListImageFiltersTestJSON-730025420-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '792bb6cce04249f8b08670ddb465bece', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '33d271be197f41f796544deefffa1975', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68906) authorize /opt/stack/nova/nova/policy.py:203}} [ 929.761839] env[68906]: DEBUG nova.network.neutron [None req-86e3143e-5477-482e-8022-d48225e10873 tempest-ListImageFiltersTestJSON-730025420 tempest-ListImageFiltersTestJSON-730025420-project-member] [instance: acc11633-a489-4d8f-ad76-f17049a91545] Successfully created port: 5c02ca39-9006-4fe0-b717-b26f0113fdc6 {{(pid=68906) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 931.039098] env[68906]: DEBUG nova.compute.manager [req-00ec455d-63c6-4198-ab15-0af898441ba7 req-86870d69-b2b1-4f23-8b98-672e1c7e01d4 service nova] [instance: acc11633-a489-4d8f-ad76-f17049a91545] Received event network-vif-plugged-5c02ca39-9006-4fe0-b717-b26f0113fdc6 {{(pid=68906) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 931.039506] env[68906]: DEBUG oslo_concurrency.lockutils [req-00ec455d-63c6-4198-ab15-0af898441ba7 req-86870d69-b2b1-4f23-8b98-672e1c7e01d4 service nova] Acquiring lock "acc11633-a489-4d8f-ad76-f17049a91545-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 931.039838] env[68906]: DEBUG oslo_concurrency.lockutils [req-00ec455d-63c6-4198-ab15-0af898441ba7 
req-86870d69-b2b1-4f23-8b98-672e1c7e01d4 service nova] Lock "acc11633-a489-4d8f-ad76-f17049a91545-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 931.040090] env[68906]: DEBUG oslo_concurrency.lockutils [req-00ec455d-63c6-4198-ab15-0af898441ba7 req-86870d69-b2b1-4f23-8b98-672e1c7e01d4 service nova] Lock "acc11633-a489-4d8f-ad76-f17049a91545-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 931.040342] env[68906]: DEBUG nova.compute.manager [req-00ec455d-63c6-4198-ab15-0af898441ba7 req-86870d69-b2b1-4f23-8b98-672e1c7e01d4 service nova] [instance: acc11633-a489-4d8f-ad76-f17049a91545] No waiting events found dispatching network-vif-plugged-5c02ca39-9006-4fe0-b717-b26f0113fdc6 {{(pid=68906) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 931.040632] env[68906]: WARNING nova.compute.manager [req-00ec455d-63c6-4198-ab15-0af898441ba7 req-86870d69-b2b1-4f23-8b98-672e1c7e01d4 service nova] [instance: acc11633-a489-4d8f-ad76-f17049a91545] Received unexpected event network-vif-plugged-5c02ca39-9006-4fe0-b717-b26f0113fdc6 for instance with vm_state building and task_state spawning. [ 931.116749] env[68906]: DEBUG nova.network.neutron [None req-86e3143e-5477-482e-8022-d48225e10873 tempest-ListImageFiltersTestJSON-730025420 tempest-ListImageFiltersTestJSON-730025420-project-member] [instance: acc11633-a489-4d8f-ad76-f17049a91545] Successfully updated port: 5c02ca39-9006-4fe0-b717-b26f0113fdc6 {{(pid=68906) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 931.132061] env[68906]: DEBUG oslo_concurrency.lockutils [None req-86e3143e-5477-482e-8022-d48225e10873 tempest-ListImageFiltersTestJSON-730025420 tempest-ListImageFiltersTestJSON-730025420-project-member] Acquiring lock "refresh_cache-acc11633-a489-4d8f-ad76-f17049a91545" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 931.132325] env[68906]: DEBUG oslo_concurrency.lockutils [None req-86e3143e-5477-482e-8022-d48225e10873 tempest-ListImageFiltersTestJSON-730025420 tempest-ListImageFiltersTestJSON-730025420-project-member] Acquired lock "refresh_cache-acc11633-a489-4d8f-ad76-f17049a91545" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 931.132525] env[68906]: DEBUG nova.network.neutron [None req-86e3143e-5477-482e-8022-d48225e10873 tempest-ListImageFiltersTestJSON-730025420 tempest-ListImageFiltersTestJSON-730025420-project-member] [instance: acc11633-a489-4d8f-ad76-f17049a91545] Building network info cache for instance {{(pid=68906) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 931.213994] env[68906]: DEBUG nova.network.neutron [None req-86e3143e-5477-482e-8022-d48225e10873 tempest-ListImageFiltersTestJSON-730025420 tempest-ListImageFiltersTestJSON-730025420-project-member] [instance: acc11633-a489-4d8f-ad76-f17049a91545] Instance cache missing network info. 
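
A few records back, nova.virt.hardware took flavor and image limits and preferences of 0:0:0 (meaning "no constraint") against defaults of 65536 each, and for a single vCPU arrived at exactly one possible topology, (1, 1, 1). A simplified sketch of that enumeration; the real code also orders candidates by preference:

```python
def possible_cpu_topologies(vcpus, max_sockets=65536, max_cores=65536,
                            max_threads=65536):
    # Every sockets*cores*threads factorization of vcpus within the limits.
    found = []
    for sockets in range(1, min(vcpus, max_sockets) + 1):
        for cores in range(1, min(vcpus, max_cores) + 1):
            for threads in range(1, min(vcpus, max_threads) + 1):
                if sockets * cores * threads == vcpus:
                    found.append((sockets, cores, threads))
    return found

print(possible_cpu_topologies(1))  # [(1, 1, 1)], matching the records above
```
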
{{(pid=68906) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 931.531268] env[68906]: DEBUG nova.network.neutron [None req-86e3143e-5477-482e-8022-d48225e10873 tempest-ListImageFiltersTestJSON-730025420 tempest-ListImageFiltersTestJSON-730025420-project-member] [instance: acc11633-a489-4d8f-ad76-f17049a91545] Updating instance_info_cache with network_info: [{"id": "5c02ca39-9006-4fe0-b717-b26f0113fdc6", "address": "fa:16:3e:46:6d:d1", "network": {"id": "63efabfb-0028-4758-9626-5f9860440121", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.208", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "1ae7bf3a375d41c6af5e7536af51ffd1", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "69054a13-b7ef-44e1-bd3b-3ca5ba602848", "external-id": "nsx-vlan-transportzone-153", "segmentation_id": 153, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap5c02ca39-90", "ovs_interfaceid": "5c02ca39-9006-4fe0-b717-b26f0113fdc6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68906) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 931.545396] env[68906]: DEBUG oslo_concurrency.lockutils [None req-86e3143e-5477-482e-8022-d48225e10873 tempest-ListImageFiltersTestJSON-730025420 tempest-ListImageFiltersTestJSON-730025420-project-member] Releasing lock "refresh_cache-acc11633-a489-4d8f-ad76-f17049a91545" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 931.545743] env[68906]: DEBUG nova.compute.manager [None req-86e3143e-5477-482e-8022-d48225e10873 tempest-ListImageFiltersTestJSON-730025420 tempest-ListImageFiltersTestJSON-730025420-project-member] [instance: acc11633-a489-4d8f-ad76-f17049a91545] Instance network_info: |[{"id": "5c02ca39-9006-4fe0-b717-b26f0113fdc6", "address": "fa:16:3e:46:6d:d1", "network": {"id": "63efabfb-0028-4758-9626-5f9860440121", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.208", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "1ae7bf3a375d41c6af5e7536af51ffd1", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "69054a13-b7ef-44e1-bd3b-3ca5ba602848", "external-id": "nsx-vlan-transportzone-153", "segmentation_id": 153, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap5c02ca39-90", "ovs_interfaceid": "5c02ca39-9006-4fe0-b717-b26f0113fdc6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=68906) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 931.546141] env[68906]: DEBUG 
nova.virt.vmwareapi.vmops [None req-86e3143e-5477-482e-8022-d48225e10873 tempest-ListImageFiltersTestJSON-730025420 tempest-ListImageFiltersTestJSON-730025420-project-member] [instance: acc11633-a489-4d8f-ad76-f17049a91545] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:46:6d:d1', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '69054a13-b7ef-44e1-bd3b-3ca5ba602848', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '5c02ca39-9006-4fe0-b717-b26f0113fdc6', 'vif_model': 'vmxnet3'}] {{(pid=68906) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 931.554534] env[68906]: DEBUG nova.virt.vmwareapi.vm_util [None req-86e3143e-5477-482e-8022-d48225e10873 tempest-ListImageFiltersTestJSON-730025420 tempest-ListImageFiltersTestJSON-730025420-project-member] Creating folder: Project (33d271be197f41f796544deefffa1975). Parent ref: group-v694750. {{(pid=68906) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 931.555120] env[68906]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-3fdf0681-1b35-4a1e-8d76-0480410c2f0f {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 931.566099] env[68906]: INFO nova.virt.vmwareapi.vm_util [None req-86e3143e-5477-482e-8022-d48225e10873 tempest-ListImageFiltersTestJSON-730025420 tempest-ListImageFiltersTestJSON-730025420-project-member] Created folder: Project (33d271be197f41f796544deefffa1975) in parent group-v694750. [ 931.566295] env[68906]: DEBUG nova.virt.vmwareapi.vm_util [None req-86e3143e-5477-482e-8022-d48225e10873 tempest-ListImageFiltersTestJSON-730025420 tempest-ListImageFiltersTestJSON-730025420-project-member] Creating folder: Instances. Parent ref: group-v694802. {{(pid=68906) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 931.566597] env[68906]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-6489733c-febd-4321-b0eb-5c1a93aa4fde {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 931.575331] env[68906]: INFO nova.virt.vmwareapi.vm_util [None req-86e3143e-5477-482e-8022-d48225e10873 tempest-ListImageFiltersTestJSON-730025420 tempest-ListImageFiltersTestJSON-730025420-project-member] Created folder: Instances in parent group-v694802. [ 931.575621] env[68906]: DEBUG oslo.service.loopingcall [None req-86e3143e-5477-482e-8022-d48225e10873 tempest-ListImageFiltersTestJSON-730025420 tempest-ListImageFiltersTestJSON-730025420-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=68906) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 931.575817] env[68906]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: acc11633-a489-4d8f-ad76-f17049a91545] Creating VM on the ESX host {{(pid=68906) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 931.576030] env[68906]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-d1a0851e-21a6-403c-a1db-6e546d70f60d {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 931.595162] env[68906]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 931.595162] env[68906]: value = "task-3475336" [ 931.595162] env[68906]: _type = "Task" [ 931.595162] env[68906]: } to complete. 
{{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 931.603322] env[68906]: DEBUG oslo_vmware.api [-] Task: {'id': task-3475336, 'name': CreateVM_Task} progress is 0%. {{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 932.108624] env[68906]: DEBUG oslo_vmware.api [-] Task: {'id': task-3475336, 'name': CreateVM_Task, 'duration_secs': 0.304625} completed successfully. {{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 932.108624] env[68906]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: acc11633-a489-4d8f-ad76-f17049a91545] Created VM on the ESX host {{(pid=68906) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 932.109071] env[68906]: DEBUG oslo_concurrency.lockutils [None req-86e3143e-5477-482e-8022-d48225e10873 tempest-ListImageFiltersTestJSON-730025420 tempest-ListImageFiltersTestJSON-730025420-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 932.109708] env[68906]: DEBUG oslo_concurrency.lockutils [None req-86e3143e-5477-482e-8022-d48225e10873 tempest-ListImageFiltersTestJSON-730025420 tempest-ListImageFiltersTestJSON-730025420-project-member] Acquired lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 932.109708] env[68906]: DEBUG oslo_concurrency.lockutils [None req-86e3143e-5477-482e-8022-d48225e10873 tempest-ListImageFiltersTestJSON-730025420 tempest-ListImageFiltersTestJSON-730025420-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 932.109889] env[68906]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-ab03cda9-09c2-4097-9ab6-70914381d853 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 932.115773] env[68906]: DEBUG oslo_vmware.api [None req-86e3143e-5477-482e-8022-d48225e10873 tempest-ListImageFiltersTestJSON-730025420 tempest-ListImageFiltersTestJSON-730025420-project-member] Waiting for the task: (returnval){ [ 932.115773] env[68906]: value = "session[52a3cb0d-1212-490c-2669-91043b4da4d8]520e6d7a-2f2e-5014-dcd2-f7d286cdab96" [ 932.115773] env[68906]: _type = "Task" [ 932.115773] env[68906]: } to complete. {{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 932.125207] env[68906]: DEBUG oslo_vmware.api [None req-86e3143e-5477-482e-8022-d48225e10873 tempest-ListImageFiltersTestJSON-730025420 tempest-ListImageFiltersTestJSON-730025420-project-member] Task: {'id': session[52a3cb0d-1212-490c-2669-91043b4da4d8]520e6d7a-2f2e-5014-dcd2-f7d286cdab96, 'name': SearchDatastore_Task} progress is 0%. 
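
CreateVM_Task above goes from "progress is 0%" to completed with a recorded duration_secs; the API layer simply polls the task until it reaches a terminal state. A generic analogue of that wait loop (`poll` is a hypothetical callable returning the task's state and payload):

```python
import time

def wait_for_task(poll, interval=0.5, timeout=300.0):
    # Poll until the task succeeds, errors out, or the deadline passes.
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        state, info = poll()
        if state == "success":
            return info
        if state == "error":
            raise RuntimeError(f"task failed: {info}")
        time.sleep(interval)
    raise TimeoutError("task did not complete before the deadline")
```
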
{{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 932.626719] env[68906]: DEBUG oslo_concurrency.lockutils [None req-86e3143e-5477-482e-8022-d48225e10873 tempest-ListImageFiltersTestJSON-730025420 tempest-ListImageFiltersTestJSON-730025420-project-member] Releasing lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 932.626993] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-86e3143e-5477-482e-8022-d48225e10873 tempest-ListImageFiltersTestJSON-730025420 tempest-ListImageFiltersTestJSON-730025420-project-member] [instance: acc11633-a489-4d8f-ad76-f17049a91545] Processing image b1400c31-d33b-4e13-944f-4c645e62493e {{(pid=68906) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 932.627222] env[68906]: DEBUG oslo_concurrency.lockutils [None req-86e3143e-5477-482e-8022-d48225e10873 tempest-ListImageFiltersTestJSON-730025420 tempest-ListImageFiltersTestJSON-730025420-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e/b1400c31-d33b-4e13-944f-4c645e62493e.vmdk" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 933.311125] env[68906]: DEBUG nova.compute.manager [req-c056edb7-0bed-42e7-825b-f877385cf8dc req-63c754ee-3a05-4228-bc60-209526e9092e service nova] [instance: acc11633-a489-4d8f-ad76-f17049a91545] Received event network-changed-5c02ca39-9006-4fe0-b717-b26f0113fdc6 {{(pid=68906) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 933.311378] env[68906]: DEBUG nova.compute.manager [req-c056edb7-0bed-42e7-825b-f877385cf8dc req-63c754ee-3a05-4228-bc60-209526e9092e service nova] [instance: acc11633-a489-4d8f-ad76-f17049a91545] Refreshing instance network info cache due to event network-changed-5c02ca39-9006-4fe0-b717-b26f0113fdc6. {{(pid=68906) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 933.311559] env[68906]: DEBUG oslo_concurrency.lockutils [req-c056edb7-0bed-42e7-825b-f877385cf8dc req-63c754ee-3a05-4228-bc60-209526e9092e service nova] Acquiring lock "refresh_cache-acc11633-a489-4d8f-ad76-f17049a91545" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 933.312025] env[68906]: DEBUG oslo_concurrency.lockutils [req-c056edb7-0bed-42e7-825b-f877385cf8dc req-63c754ee-3a05-4228-bc60-209526e9092e service nova] Acquired lock "refresh_cache-acc11633-a489-4d8f-ad76-f17049a91545" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 933.312025] env[68906]: DEBUG nova.network.neutron [req-c056edb7-0bed-42e7-825b-f877385cf8dc req-63c754ee-3a05-4228-bc60-209526e9092e service nova] [instance: acc11633-a489-4d8f-ad76-f17049a91545] Refreshing network info cache for port 5c02ca39-9006-4fe0-b717-b26f0113fdc6 {{(pid=68906) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 933.761205] env[68906]: DEBUG nova.network.neutron [req-c056edb7-0bed-42e7-825b-f877385cf8dc req-63c754ee-3a05-4228-bc60-209526e9092e service nova] [instance: acc11633-a489-4d8f-ad76-f17049a91545] Updated VIF entry in instance network info cache for port 5c02ca39-9006-4fe0-b717-b26f0113fdc6. 
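
The instance network info cached above (and refreshed again just below after the network-changed event) is a JSON list of VIFs. A small helper for pulling the fixed IPv4 addresses out of such a blob, assuming only the structure visible in these records:

```python
import json

def fixed_ips(network_info_json: str) -> list[str]:
    # Walk vif -> network -> subnets -> ips and keep the fixed addresses.
    addrs = []
    for vif in json.loads(network_info_json):
        for subnet in vif["network"]["subnets"]:
            addrs.extend(ip["address"]
                         for ip in subnet["ips"] if ip["type"] == "fixed")
    return addrs

# For the cache entry above this would return ["192.168.233.208"].
```
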
{{(pid=68906) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 933.761631] env[68906]: DEBUG nova.network.neutron [req-c056edb7-0bed-42e7-825b-f877385cf8dc req-63c754ee-3a05-4228-bc60-209526e9092e service nova] [instance: acc11633-a489-4d8f-ad76-f17049a91545] Updating instance_info_cache with network_info: [{"id": "5c02ca39-9006-4fe0-b717-b26f0113fdc6", "address": "fa:16:3e:46:6d:d1", "network": {"id": "63efabfb-0028-4758-9626-5f9860440121", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.208", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "1ae7bf3a375d41c6af5e7536af51ffd1", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "69054a13-b7ef-44e1-bd3b-3ca5ba602848", "external-id": "nsx-vlan-transportzone-153", "segmentation_id": 153, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap5c02ca39-90", "ovs_interfaceid": "5c02ca39-9006-4fe0-b717-b26f0113fdc6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68906) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 933.782294] env[68906]: DEBUG oslo_concurrency.lockutils [req-c056edb7-0bed-42e7-825b-f877385cf8dc req-63c754ee-3a05-4228-bc60-209526e9092e service nova] Releasing lock "refresh_cache-acc11633-a489-4d8f-ad76-f17049a91545" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 935.045819] env[68906]: DEBUG oslo_concurrency.lockutils [None req-bf384869-204f-4e55-8c64-1babf5601dfe tempest-ServerMetadataTestJSON-461341781 tempest-ServerMetadataTestJSON-461341781-project-member] Acquiring lock "4d36bb91-0cde-44cb-8706-d17740a9cf50" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 935.046148] env[68906]: DEBUG oslo_concurrency.lockutils [None req-bf384869-204f-4e55-8c64-1babf5601dfe tempest-ServerMetadataTestJSON-461341781 tempest-ServerMetadataTestJSON-461341781-project-member] Lock "4d36bb91-0cde-44cb-8706-d17740a9cf50" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 936.009142] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 936.135727] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 936.159473] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] 
Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 936.159473] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 937.141861] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 937.142198] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 937.142359] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Starting heal instance info cache {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 937.142470] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Rebuilding the list of instances to heal {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 937.162839] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: 4edb8b9f-b608-4be8-bfd3-65642710f9bd] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 937.163012] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: d6ca51b9-b284-405c-878e-fdbc326b73e1] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 937.163178] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: f42056e5-52cb-4d69-8022-ca643c49194e] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 937.163307] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: ce63789a-1f0f-40ca-8368-ac3f84bb58cd] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 937.163432] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: 9a2d2803-34b1-40f7-9349-e5734a217e18] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 937.163560] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: 13eebe4e-5984-46c3-bb73-cd783ad45df6] Skipping network cache update for instance because it is Building. 
{{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 937.163680] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 937.163794] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 937.163929] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: 91d405f9-5ad1-4e8e-9a32-1a5ef81ecc63] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 937.164032] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: acc11633-a489-4d8f-ad76-f17049a91545] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 937.164157] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Didn't find any instances for network info cache update. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 938.140758] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 938.140924] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] CONF.reclaim_instance_interval <= 0, skipping... 
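
The periodic-task burst above ends with the info-cache healer skipping every instance because all of them are still Building, then reporting nothing to do; _reclaim_queued_deletes likewise short-circuits when its interval is disabled. A sketch of the healer's skip logic, assuming a minimal instance dict:

```python
def heal_instance_info_cache(instances, refresh):
    # Refresh network info only for instances in a stable state; anything
    # still building is skipped, exactly as the records above show.
    healed = 0
    for inst in instances:
        if inst["vm_state"] == "building":
            print(f"instance {inst['uuid']}: skipping cache update (Building)")
            continue
        refresh(inst["uuid"])
        healed += 1
    if not healed:
        print("Didn't find any instances for network info cache update.")
```
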
{{(pid=68906) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 938.781754] env[68906]: DEBUG oslo_concurrency.lockutils [None req-43520988-2e5c-4ea6-8d31-9e7f370b0b01 tempest-ListImageFiltersTestJSON-730025420 tempest-ListImageFiltersTestJSON-730025420-project-member] Acquiring lock "acc11633-a489-4d8f-ad76-f17049a91545" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 940.140619] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 940.140949] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 941.139986] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager.update_available_resource {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 941.150484] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 941.150821] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 941.150878] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 941.151030] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=68906) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 941.152094] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fa43d360-dbfd-40fa-a361-44dd66d5f2cf {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 941.160670] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0d284393-ddf4-4f16-b81b-cf85c1325b55 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 941.174613] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a3c54288-7bd6-479d-b85e-c81a10bfad02 {{(pid=68906) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 941.180673] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e1e52eab-652e-43a0-a95d-bcc2a382f773 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 941.212742] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180961MB free_disk=93GB free_vcpus=48 pci_devices=None {{(pid=68906) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 941.212888] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 941.213110] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 941.288933] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 4edb8b9f-b608-4be8-bfd3-65642710f9bd actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 941.288933] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance d6ca51b9-b284-405c-878e-fdbc326b73e1 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 941.288933] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance f42056e5-52cb-4d69-8022-ca643c49194e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 941.288933] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance ce63789a-1f0f-40ca-8368-ac3f84bb58cd actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 941.289203] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 13eebe4e-5984-46c3-bb73-cd783ad45df6 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
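
Every lockutils record in this log carries "waited" and "held" timings: waited is the gap between requesting and acquiring the lock, held is acquisition to release. A stdlib context manager that reproduces that accounting:

```python
import threading
import time
from contextlib import contextmanager

@contextmanager
def timed_lock(lock: threading.Lock, name: str, owner: str):
    t0 = time.monotonic()
    lock.acquire()
    t_acquired = time.monotonic()
    print(f'Lock "{name}" acquired by "{owner}" :: '
          f'waited {t_acquired - t0:.3f}s')
    try:
        yield
    finally:
        held = time.monotonic() - t_acquired
        lock.release()
        print(f'Lock "{name}" "released" by "{owner}" :: held {held:.3f}s')
```
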
{{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 941.289203] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 9a2d2803-34b1-40f7-9349-e5734a217e18 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 941.289203] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance a7e0a28f-42a5-442e-b962-07771d2e6a27 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 941.289203] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance eb81e9b1-b573-4d7c-9ede-f8b32a43a201 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 941.289340] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 91d405f9-5ad1-4e8e-9a32-1a5ef81ecc63 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 941.289340] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance acc11633-a489-4d8f-ad76-f17049a91545 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 941.302146] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance e7286888-d79d-4632-9c06-69c1ef47fa50 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 941.313165] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 641cca5b-d749-4331-a5e0-8acb6d47cba2 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 941.322991] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 6d7b4648-a12f-4c3c-8465-b8fb37eb0d3c has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 941.334180] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance ad955cdc-85f1-4096-b2ec-7635d289ee57 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 941.345476] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 9bd17d9a-2f34-4e99-91b1-a7c3fbc1f29a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 941.357120] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance a37ef3ce-1c29-48fe-b9c6-023da5b3db71 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 941.373877] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance ee17e223-bec7-4541-8cb2-25e4a6c32b34 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 941.385014] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 3ba4a60f-6c41-4e1e-8928-f1b95b885028 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 941.395096] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 56f936b4-680d-40db-84ab-8eb319f6ee83 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 941.405739] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance faec727e-bd92-4201-aaca-5863208be265 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 941.416454] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 18a5c392-b836-4d2a-bb77-d4af0b9fdb81 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 941.426418] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 2e6de0b1-335b-49bd-aa15-3fd4cc4b4e9e has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 941.439930] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance d71bae07-54c1-427b-bfe1-2467369cd80c has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 941.450692] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 682f0e61-471f-47fb-98de-02449b17d241 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 941.460942] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 4d36bb91-0cde-44cb-8706-d17740a9cf50 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 941.461221] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=68906) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 941.461371] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=68906) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 941.761798] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a5e20d96-a9ab-4b97-a74a-d3afb501cff7 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 941.770525] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5b5ad564-26a7-412f-b600-ee7d3a7f0d2b {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 941.800299] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5043dd03-8338-478f-91d7-045e3fd158fa {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 941.807554] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-17e06d9f-a375-4315-ae85-0c3c6641ff35 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 941.821226] env[68906]: DEBUG nova.compute.provider_tree [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Inventory has not changed in ProviderTree for provider: 1119f6db-bfd7-4ef3-bdff-5c6974dc249b {{(pid=68906) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 941.829952] env[68906]: DEBUG nova.scheduler.client.report [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Inventory has not changed for provider 1119f6db-bfd7-4ef3-bdff-5c6974dc249b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 93, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68906) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 941.843411] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=68906) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 941.843641] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.630s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
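
The inventory record above is what this node reports to placement. The consumable capacity it implies can be reproduced with a few lines of plain Python (a standalone sketch, not Nova code; the helper name is illustrative). Placement allows (total - reserved) * allocation_ratio of each resource class, which is how 48 physical vcpus become 192 allocatable VCPU under an allocation_ratio of 4.0:

    # Values copied from the inventory record logged above.
    INVENTORY = {
        'VCPU':      {'total': 48,     'reserved': 0,   'allocation_ratio': 4.0},
        'MEMORY_MB': {'total': 196590, 'reserved': 512, 'allocation_ratio': 1.0},
        'DISK_GB':   {'total': 400,    'reserved': 0,   'allocation_ratio': 1.0},
    }

    def consumable(inventory):
        # Placement treats (total - reserved) * allocation_ratio as the
        # amount of each resource class that may be allocated.
        return {rc: int((f['total'] - f['reserved']) * f['allocation_ratio'])
                for rc, f in inventory.items()}

    print(consumable(INVENTORY))
    # -> {'VCPU': 192, 'MEMORY_MB': 196078, 'DISK_GB': 400}
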
[ 975.591648] env[68906]: WARNING oslo_vmware.rw_handles [None req-a8230d4c-efd4-4641-85fe-bd949bd747e5 tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 975.591648] env[68906]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 975.591648] env[68906]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 975.591648] env[68906]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 975.591648] env[68906]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 975.591648] env[68906]: ERROR oslo_vmware.rw_handles response.begin() [ 975.591648] env[68906]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 975.591648] env[68906]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 975.591648] env[68906]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 975.591648] env[68906]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 975.591648] env[68906]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 975.591648] env[68906]: ERROR oslo_vmware.rw_handles [ 975.591648] env[68906]: DEBUG nova.virt.vmwareapi.images [None req-a8230d4c-efd4-4641-85fe-bd949bd747e5 tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] [instance: 4edb8b9f-b608-4be8-bfd3-65642710f9bd] Downloaded image file data b1400c31-d33b-4e13-944f-4c645e62493e to vmware_temp/4c279939-0d5e-4c87-b814-e9a22431f9d2/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk on the data store datastore2 {{(pid=68906) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 975.592986] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-a8230d4c-efd4-4641-85fe-bd949bd747e5 tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] [instance: 4edb8b9f-b608-4be8-bfd3-65642710f9bd] Caching image {{(pid=68906) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 975.593366] env[68906]: DEBUG nova.virt.vmwareapi.vm_util [None req-a8230d4c-efd4-4641-85fe-bd949bd747e5 tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Copying Virtual Disk [datastore2] vmware_temp/4c279939-0d5e-4c87-b814-e9a22431f9d2/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk to [datastore2] vmware_temp/4c279939-0d5e-4c87-b814-e9a22431f9d2/b1400c31-d33b-4e13-944f-4c645e62493e/b1400c31-d33b-4e13-944f-4c645e62493e.vmdk {{(pid=68906) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 975.593583] env[68906]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-2eb6ff60-9817-4318-998d-06d4af9bf866 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 975.602033] env[68906]: DEBUG oslo_vmware.api [None req-a8230d4c-efd4-4641-85fe-bd949bd747e5 tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Waiting for the task: (returnval){ [ 975.602033] env[68906]: value = "task-3475337" [ 975.602033] env[68906]: _type = "Task" [ 975.602033] env[68906]: } to complete.
{{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 975.610371] env[68906]: DEBUG oslo_vmware.api [None req-a8230d4c-efd4-4641-85fe-bd949bd747e5 tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Task: {'id': task-3475337, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 976.112633] env[68906]: DEBUG oslo_vmware.exceptions [None req-a8230d4c-efd4-4641-85fe-bd949bd747e5 tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Fault InvalidArgument not matched. {{(pid=68906) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 976.112989] env[68906]: DEBUG oslo_concurrency.lockutils [None req-a8230d4c-efd4-4641-85fe-bd949bd747e5 tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Releasing lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e/b1400c31-d33b-4e13-944f-4c645e62493e.vmdk" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 976.113628] env[68906]: ERROR nova.compute.manager [None req-a8230d4c-efd4-4641-85fe-bd949bd747e5 tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] [instance: 4edb8b9f-b608-4be8-bfd3-65642710f9bd] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 976.113628] env[68906]: Faults: ['InvalidArgument'] [ 976.113628] env[68906]: ERROR nova.compute.manager [instance: 4edb8b9f-b608-4be8-bfd3-65642710f9bd] Traceback (most recent call last): [ 976.113628] env[68906]: ERROR nova.compute.manager [instance: 4edb8b9f-b608-4be8-bfd3-65642710f9bd] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 976.113628] env[68906]: ERROR nova.compute.manager [instance: 4edb8b9f-b608-4be8-bfd3-65642710f9bd] yield resources [ 976.113628] env[68906]: ERROR nova.compute.manager [instance: 4edb8b9f-b608-4be8-bfd3-65642710f9bd] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 976.113628] env[68906]: ERROR nova.compute.manager [instance: 4edb8b9f-b608-4be8-bfd3-65642710f9bd] self.driver.spawn(context, instance, image_meta, [ 976.113628] env[68906]: ERROR nova.compute.manager [instance: 4edb8b9f-b608-4be8-bfd3-65642710f9bd] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 976.113628] env[68906]: ERROR nova.compute.manager [instance: 4edb8b9f-b608-4be8-bfd3-65642710f9bd] self._vmops.spawn(context, instance, image_meta, injected_files, [ 976.113628] env[68906]: ERROR nova.compute.manager [instance: 4edb8b9f-b608-4be8-bfd3-65642710f9bd] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 976.113628] env[68906]: ERROR nova.compute.manager [instance: 4edb8b9f-b608-4be8-bfd3-65642710f9bd] self._fetch_image_if_missing(context, vi) [ 976.113628] env[68906]: ERROR nova.compute.manager [instance: 4edb8b9f-b608-4be8-bfd3-65642710f9bd] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 976.114055] env[68906]: ERROR nova.compute.manager [instance: 4edb8b9f-b608-4be8-bfd3-65642710f9bd] image_cache(vi, tmp_image_ds_loc) [ 976.114055] env[68906]: ERROR nova.compute.manager [instance: 
4edb8b9f-b608-4be8-bfd3-65642710f9bd] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 976.114055] env[68906]: ERROR nova.compute.manager [instance: 4edb8b9f-b608-4be8-bfd3-65642710f9bd] vm_util.copy_virtual_disk( [ 976.114055] env[68906]: ERROR nova.compute.manager [instance: 4edb8b9f-b608-4be8-bfd3-65642710f9bd] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 976.114055] env[68906]: ERROR nova.compute.manager [instance: 4edb8b9f-b608-4be8-bfd3-65642710f9bd] session._wait_for_task(vmdk_copy_task) [ 976.114055] env[68906]: ERROR nova.compute.manager [instance: 4edb8b9f-b608-4be8-bfd3-65642710f9bd] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 976.114055] env[68906]: ERROR nova.compute.manager [instance: 4edb8b9f-b608-4be8-bfd3-65642710f9bd] return self.wait_for_task(task_ref) [ 976.114055] env[68906]: ERROR nova.compute.manager [instance: 4edb8b9f-b608-4be8-bfd3-65642710f9bd] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 976.114055] env[68906]: ERROR nova.compute.manager [instance: 4edb8b9f-b608-4be8-bfd3-65642710f9bd] return evt.wait() [ 976.114055] env[68906]: ERROR nova.compute.manager [instance: 4edb8b9f-b608-4be8-bfd3-65642710f9bd] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 976.114055] env[68906]: ERROR nova.compute.manager [instance: 4edb8b9f-b608-4be8-bfd3-65642710f9bd] result = hub.switch() [ 976.114055] env[68906]: ERROR nova.compute.manager [instance: 4edb8b9f-b608-4be8-bfd3-65642710f9bd] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 976.114055] env[68906]: ERROR nova.compute.manager [instance: 4edb8b9f-b608-4be8-bfd3-65642710f9bd] return self.greenlet.switch() [ 976.114659] env[68906]: ERROR nova.compute.manager [instance: 4edb8b9f-b608-4be8-bfd3-65642710f9bd] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 976.114659] env[68906]: ERROR nova.compute.manager [instance: 4edb8b9f-b608-4be8-bfd3-65642710f9bd] self.f(*self.args, **self.kw) [ 976.114659] env[68906]: ERROR nova.compute.manager [instance: 4edb8b9f-b608-4be8-bfd3-65642710f9bd] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 976.114659] env[68906]: ERROR nova.compute.manager [instance: 4edb8b9f-b608-4be8-bfd3-65642710f9bd] raise exceptions.translate_fault(task_info.error) [ 976.114659] env[68906]: ERROR nova.compute.manager [instance: 4edb8b9f-b608-4be8-bfd3-65642710f9bd] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 976.114659] env[68906]: ERROR nova.compute.manager [instance: 4edb8b9f-b608-4be8-bfd3-65642710f9bd] Faults: ['InvalidArgument'] [ 976.114659] env[68906]: ERROR nova.compute.manager [instance: 4edb8b9f-b608-4be8-bfd3-65642710f9bd] [ 976.114659] env[68906]: INFO nova.compute.manager [None req-a8230d4c-efd4-4641-85fe-bd949bd747e5 tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] [instance: 4edb8b9f-b608-4be8-bfd3-65642710f9bd] Terminating instance [ 976.115574] env[68906]: DEBUG oslo_concurrency.lockutils [None req-579f131a-2dbc-470d-9058-ec3a343d421d tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Acquired lock "[datastore2] 
devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e/b1400c31-d33b-4e13-944f-4c645e62493e.vmdk" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 976.115701] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-579f131a-2dbc-470d-9058-ec3a343d421d tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=68906) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 976.116334] env[68906]: DEBUG nova.compute.manager [None req-a8230d4c-efd4-4641-85fe-bd949bd747e5 tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] [instance: 4edb8b9f-b608-4be8-bfd3-65642710f9bd] Start destroying the instance on the hypervisor. {{(pid=68906) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 976.116522] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-a8230d4c-efd4-4641-85fe-bd949bd747e5 tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] [instance: 4edb8b9f-b608-4be8-bfd3-65642710f9bd] Destroying instance {{(pid=68906) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 976.116740] env[68906]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-ee484bc4-e364-4926-b977-4707b65d019b {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 976.119358] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fd59a16d-9be5-4553-897b-0a0550eb36b7 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 976.125903] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-a8230d4c-efd4-4641-85fe-bd949bd747e5 tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] [instance: 4edb8b9f-b608-4be8-bfd3-65642710f9bd] Unregistering the VM {{(pid=68906) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 976.126119] env[68906]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-31b67a1f-3a3d-4913-b914-6779a95fbd8f {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 976.128558] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-579f131a-2dbc-470d-9058-ec3a343d421d tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=68906) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 976.128730] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-579f131a-2dbc-470d-9058-ec3a343d421d tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Folder [datastore2] devstack-image-cache_base created. 
{{(pid=68906) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 976.129675] env[68906]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-8e261b99-b670-4a16-87a0-f33bd35676d7 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 976.134226] env[68906]: DEBUG oslo_vmware.api [None req-579f131a-2dbc-470d-9058-ec3a343d421d tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Waiting for the task: (returnval){ [ 976.134226] env[68906]: value = "session[52a3cb0d-1212-490c-2669-91043b4da4d8]5280f69d-6696-4204-ee37-d934c8ca4707" [ 976.134226] env[68906]: _type = "Task" [ 976.134226] env[68906]: } to complete. {{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 976.141155] env[68906]: DEBUG oslo_vmware.api [None req-579f131a-2dbc-470d-9058-ec3a343d421d tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Task: {'id': session[52a3cb0d-1212-490c-2669-91043b4da4d8]5280f69d-6696-4204-ee37-d934c8ca4707, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 976.193588] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-a8230d4c-efd4-4641-85fe-bd949bd747e5 tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] [instance: 4edb8b9f-b608-4be8-bfd3-65642710f9bd] Unregistered the VM {{(pid=68906) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 976.193815] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-a8230d4c-efd4-4641-85fe-bd949bd747e5 tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] [instance: 4edb8b9f-b608-4be8-bfd3-65642710f9bd] Deleting contents of the VM from datastore datastore2 {{(pid=68906) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 976.193994] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-a8230d4c-efd4-4641-85fe-bd949bd747e5 tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Deleting the datastore file [datastore2] 4edb8b9f-b608-4be8-bfd3-65642710f9bd {{(pid=68906) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 976.194274] env[68906]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-af402beb-3595-4e99-bb84-d824db92ff8f {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 976.200821] env[68906]: DEBUG oslo_vmware.api [None req-a8230d4c-efd4-4641-85fe-bd949bd747e5 tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Waiting for the task: (returnval){ [ 976.200821] env[68906]: value = "task-3475339" [ 976.200821] env[68906]: _type = "Task" [ 976.200821] env[68906]: } to complete. {{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 976.208605] env[68906]: DEBUG oslo_vmware.api [None req-a8230d4c-efd4-4641-85fe-bd949bd747e5 tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Task: {'id': task-3475339, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 976.644791] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-579f131a-2dbc-470d-9058-ec3a343d421d tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: d6ca51b9-b284-405c-878e-fdbc326b73e1] Preparing fetch location {{(pid=68906) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 976.645150] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-579f131a-2dbc-470d-9058-ec3a343d421d tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Creating directory with path [datastore2] vmware_temp/e3c62a77-aea4-4be1-9282-8082437ef873/b1400c31-d33b-4e13-944f-4c645e62493e {{(pid=68906) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 976.645300] env[68906]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-9597a7ea-cf09-42ca-955c-4ed4d4ca7594 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 976.656878] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-579f131a-2dbc-470d-9058-ec3a343d421d tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Created directory with path [datastore2] vmware_temp/e3c62a77-aea4-4be1-9282-8082437ef873/b1400c31-d33b-4e13-944f-4c645e62493e {{(pid=68906) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 976.657079] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-579f131a-2dbc-470d-9058-ec3a343d421d tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: d6ca51b9-b284-405c-878e-fdbc326b73e1] Fetch image to [datastore2] vmware_temp/e3c62a77-aea4-4be1-9282-8082437ef873/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk {{(pid=68906) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 976.657277] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-579f131a-2dbc-470d-9058-ec3a343d421d tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: d6ca51b9-b284-405c-878e-fdbc326b73e1] Downloading image file data b1400c31-d33b-4e13-944f-4c645e62493e to [datastore2] vmware_temp/e3c62a77-aea4-4be1-9282-8082437ef873/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk on the data store datastore2 {{(pid=68906) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 976.657985] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-16d45cd8-693c-4573-aa48-aeb1c0bb3e4b {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 976.664468] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4491063b-19cc-4436-8dec-489c05639dc8 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 976.673321] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-614d88be-969c-489b-b7ad-0877fb9f5a5f {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 976.706558] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-26fe03ce-2411-4747-94b8-62a35262a944 {{(pid=68906) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 976.713296] env[68906]: DEBUG oslo_vmware.api [None req-a8230d4c-efd4-4641-85fe-bd949bd747e5 tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Task: {'id': task-3475339, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.076157} completed successfully. {{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 976.714683] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-a8230d4c-efd4-4641-85fe-bd949bd747e5 tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Deleted the datastore file {{(pid=68906) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 976.714872] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-a8230d4c-efd4-4641-85fe-bd949bd747e5 tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] [instance: 4edb8b9f-b608-4be8-bfd3-65642710f9bd] Deleted contents of the VM from datastore datastore2 {{(pid=68906) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 976.715054] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-a8230d4c-efd4-4641-85fe-bd949bd747e5 tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] [instance: 4edb8b9f-b608-4be8-bfd3-65642710f9bd] Instance destroyed {{(pid=68906) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 976.715232] env[68906]: INFO nova.compute.manager [None req-a8230d4c-efd4-4641-85fe-bd949bd747e5 tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] [instance: 4edb8b9f-b608-4be8-bfd3-65642710f9bd] Took 0.60 seconds to destroy the instance on the hypervisor. 
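
The pattern above (invoke a vCenter *_Task method, then poll until it completes or faults) goes through oslo.vmware's session layer. Below is a hedged sketch of the same flow; the endpoint and credentials are placeholders, and constructor argument names may differ slightly between oslo.vmware releases:

    from oslo_vmware import api
    from oslo_vmware import exceptions as vmw_exc

    def make_session():
        # Placeholder endpoint/credentials; task_poll_interval drives the
        # "progress is N%" polling lines seen in the log above.
        return api.VMwareAPISession('vc.example.test', 'user', 'secret',
                                    api_retry_count=10,
                                    task_poll_interval=0.5)

    def wait_and_report(session, task_ref):
        try:
            # Blocks until the task reaches success, raising a translated
            # exception if the server reports an error state.
            return session.wait_for_task(task_ref)
        except vmw_exc.VimFaultException as e:
            # e.fault_list carries fault names such as 'InvalidArgument',
            # matching the "Faults: ['InvalidArgument']" traceback above.
            print('task failed, faults: %s' % e.fault_list)
            raise
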
[ 976.717010] env[68906]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-1c19ecc4-4430-465a-a8a9-626ea3126416 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 976.718883] env[68906]: DEBUG nova.compute.claims [None req-a8230d4c-efd4-4641-85fe-bd949bd747e5 tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] [instance: 4edb8b9f-b608-4be8-bfd3-65642710f9bd] Aborting claim: {{(pid=68906) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 976.719062] env[68906]: DEBUG oslo_concurrency.lockutils [None req-a8230d4c-efd4-4641-85fe-bd949bd747e5 tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 976.719282] env[68906]: DEBUG oslo_concurrency.lockutils [None req-a8230d4c-efd4-4641-85fe-bd949bd747e5 tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 976.741212] env[68906]: DEBUG nova.virt.vmwareapi.images [None req-579f131a-2dbc-470d-9058-ec3a343d421d tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: d6ca51b9-b284-405c-878e-fdbc326b73e1] Downloading image file data b1400c31-d33b-4e13-944f-4c645e62493e to the data store datastore2 {{(pid=68906) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 976.798625] env[68906]: DEBUG oslo_vmware.rw_handles [None req-579f131a-2dbc-470d-9058-ec3a343d421d tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/e3c62a77-aea4-4be1-9282-8082437ef873/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=68906) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 976.858556] env[68906]: DEBUG oslo_vmware.rw_handles [None req-579f131a-2dbc-470d-9058-ec3a343d421d tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Completed reading data from the image iterator. {{(pid=68906) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 976.858556] env[68906]: DEBUG oslo_vmware.rw_handles [None req-579f131a-2dbc-470d-9058-ec3a343d421d tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Closing write handle for https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/e3c62a77-aea4-4be1-9282-8082437ef873/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=68906) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 977.099324] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-89d76a22-9d9a-4def-98e0-d31886ec2da5 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 977.107076] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cc0058d7-3815-4d1e-92c9-bc0f07dcf45b {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 977.138404] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a7f9e353-bfbd-4526-bb13-99eb09aa561b {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 977.146016] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e828049a-1455-42d7-aff2-3932d57f66fd {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 977.159582] env[68906]: DEBUG nova.compute.provider_tree [None req-a8230d4c-efd4-4641-85fe-bd949bd747e5 tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Inventory has not changed in ProviderTree for provider: 1119f6db-bfd7-4ef3-bdff-5c6974dc249b {{(pid=68906) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 977.170765] env[68906]: DEBUG nova.scheduler.client.report [None req-a8230d4c-efd4-4641-85fe-bd949bd747e5 tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Inventory has not changed for provider 1119f6db-bfd7-4ef3-bdff-5c6974dc249b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 93, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68906) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 977.186423] env[68906]: DEBUG oslo_concurrency.lockutils [None req-a8230d4c-efd4-4641-85fe-bd949bd747e5 tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.467s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 977.186947] env[68906]: ERROR nova.compute.manager [None req-a8230d4c-efd4-4641-85fe-bd949bd747e5 tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] [instance: 4edb8b9f-b608-4be8-bfd3-65642710f9bd] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 977.186947] env[68906]: Faults: ['InvalidArgument'] [ 977.186947] env[68906]: ERROR nova.compute.manager [instance: 4edb8b9f-b608-4be8-bfd3-65642710f9bd] Traceback (most recent call last): [ 977.186947] env[68906]: ERROR nova.compute.manager [instance: 4edb8b9f-b608-4be8-bfd3-65642710f9bd] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 977.186947] env[68906]: ERROR 
nova.compute.manager [instance: 4edb8b9f-b608-4be8-bfd3-65642710f9bd] self.driver.spawn(context, instance, image_meta, [ 977.186947] env[68906]: ERROR nova.compute.manager [instance: 4edb8b9f-b608-4be8-bfd3-65642710f9bd] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 977.186947] env[68906]: ERROR nova.compute.manager [instance: 4edb8b9f-b608-4be8-bfd3-65642710f9bd] self._vmops.spawn(context, instance, image_meta, injected_files, [ 977.186947] env[68906]: ERROR nova.compute.manager [instance: 4edb8b9f-b608-4be8-bfd3-65642710f9bd] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 977.186947] env[68906]: ERROR nova.compute.manager [instance: 4edb8b9f-b608-4be8-bfd3-65642710f9bd] self._fetch_image_if_missing(context, vi) [ 977.186947] env[68906]: ERROR nova.compute.manager [instance: 4edb8b9f-b608-4be8-bfd3-65642710f9bd] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 977.186947] env[68906]: ERROR nova.compute.manager [instance: 4edb8b9f-b608-4be8-bfd3-65642710f9bd] image_cache(vi, tmp_image_ds_loc) [ 977.186947] env[68906]: ERROR nova.compute.manager [instance: 4edb8b9f-b608-4be8-bfd3-65642710f9bd] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 977.187429] env[68906]: ERROR nova.compute.manager [instance: 4edb8b9f-b608-4be8-bfd3-65642710f9bd] vm_util.copy_virtual_disk( [ 977.187429] env[68906]: ERROR nova.compute.manager [instance: 4edb8b9f-b608-4be8-bfd3-65642710f9bd] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 977.187429] env[68906]: ERROR nova.compute.manager [instance: 4edb8b9f-b608-4be8-bfd3-65642710f9bd] session._wait_for_task(vmdk_copy_task) [ 977.187429] env[68906]: ERROR nova.compute.manager [instance: 4edb8b9f-b608-4be8-bfd3-65642710f9bd] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 977.187429] env[68906]: ERROR nova.compute.manager [instance: 4edb8b9f-b608-4be8-bfd3-65642710f9bd] return self.wait_for_task(task_ref) [ 977.187429] env[68906]: ERROR nova.compute.manager [instance: 4edb8b9f-b608-4be8-bfd3-65642710f9bd] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 977.187429] env[68906]: ERROR nova.compute.manager [instance: 4edb8b9f-b608-4be8-bfd3-65642710f9bd] return evt.wait() [ 977.187429] env[68906]: ERROR nova.compute.manager [instance: 4edb8b9f-b608-4be8-bfd3-65642710f9bd] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 977.187429] env[68906]: ERROR nova.compute.manager [instance: 4edb8b9f-b608-4be8-bfd3-65642710f9bd] result = hub.switch() [ 977.187429] env[68906]: ERROR nova.compute.manager [instance: 4edb8b9f-b608-4be8-bfd3-65642710f9bd] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 977.187429] env[68906]: ERROR nova.compute.manager [instance: 4edb8b9f-b608-4be8-bfd3-65642710f9bd] return self.greenlet.switch() [ 977.187429] env[68906]: ERROR nova.compute.manager [instance: 4edb8b9f-b608-4be8-bfd3-65642710f9bd] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 977.187429] env[68906]: ERROR nova.compute.manager [instance: 4edb8b9f-b608-4be8-bfd3-65642710f9bd] self.f(*self.args, **self.kw) [ 977.187809] env[68906]: ERROR nova.compute.manager [instance: 4edb8b9f-b608-4be8-bfd3-65642710f9bd] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 977.187809] env[68906]: ERROR nova.compute.manager [instance: 4edb8b9f-b608-4be8-bfd3-65642710f9bd] raise exceptions.translate_fault(task_info.error) [ 977.187809] env[68906]: ERROR nova.compute.manager [instance: 4edb8b9f-b608-4be8-bfd3-65642710f9bd] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 977.187809] env[68906]: ERROR nova.compute.manager [instance: 4edb8b9f-b608-4be8-bfd3-65642710f9bd] Faults: ['InvalidArgument'] [ 977.187809] env[68906]: ERROR nova.compute.manager [instance: 4edb8b9f-b608-4be8-bfd3-65642710f9bd] [ 977.187809] env[68906]: DEBUG nova.compute.utils [None req-a8230d4c-efd4-4641-85fe-bd949bd747e5 tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] [instance: 4edb8b9f-b608-4be8-bfd3-65642710f9bd] VimFaultException {{(pid=68906) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 977.189630] env[68906]: DEBUG nova.compute.manager [None req-a8230d4c-efd4-4641-85fe-bd949bd747e5 tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] [instance: 4edb8b9f-b608-4be8-bfd3-65642710f9bd] Build of instance 4edb8b9f-b608-4be8-bfd3-65642710f9bd was re-scheduled: A specified parameter was not correct: fileType [ 977.189630] env[68906]: Faults: ['InvalidArgument'] {{(pid=68906) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 977.189956] env[68906]: DEBUG nova.compute.manager [None req-a8230d4c-efd4-4641-85fe-bd949bd747e5 tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] [instance: 4edb8b9f-b608-4be8-bfd3-65642710f9bd] Unplugging VIFs for instance {{(pid=68906) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 977.190152] env[68906]: DEBUG nova.compute.manager [None req-a8230d4c-efd4-4641-85fe-bd949bd747e5 tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=68906) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 977.190370] env[68906]: DEBUG nova.compute.manager [None req-a8230d4c-efd4-4641-85fe-bd949bd747e5 tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] [instance: 4edb8b9f-b608-4be8-bfd3-65642710f9bd] Deallocating network for instance {{(pid=68906) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 977.190498] env[68906]: DEBUG nova.network.neutron [None req-a8230d4c-efd4-4641-85fe-bd949bd747e5 tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] [instance: 4edb8b9f-b608-4be8-bfd3-65642710f9bd] deallocate_for_instance() {{(pid=68906) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 977.659752] env[68906]: DEBUG nova.network.neutron [None req-a8230d4c-efd4-4641-85fe-bd949bd747e5 tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] [instance: 4edb8b9f-b608-4be8-bfd3-65642710f9bd] Updating instance_info_cache with network_info: [] {{(pid=68906) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 977.679914] env[68906]: INFO nova.compute.manager [None req-a8230d4c-efd4-4641-85fe-bd949bd747e5 tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] [instance: 4edb8b9f-b608-4be8-bfd3-65642710f9bd] Took 0.49 seconds to deallocate network for instance. [ 977.774061] env[68906]: INFO nova.scheduler.client.report [None req-a8230d4c-efd4-4641-85fe-bd949bd747e5 tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Deleted allocations for instance 4edb8b9f-b608-4be8-bfd3-65642710f9bd [ 977.810707] env[68906]: DEBUG oslo_concurrency.lockutils [None req-a8230d4c-efd4-4641-85fe-bd949bd747e5 tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Lock "4edb8b9f-b608-4be8-bfd3-65642710f9bd" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 335.122s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 977.812844] env[68906]: DEBUG oslo_concurrency.lockutils [None req-6333a4bb-d4a4-44d7-b760-cbd7cf2f16fd tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Lock "4edb8b9f-b608-4be8-bfd3-65642710f9bd" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 136.504s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 977.813165] env[68906]: DEBUG oslo_concurrency.lockutils [None req-6333a4bb-d4a4-44d7-b760-cbd7cf2f16fd tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Acquiring lock "4edb8b9f-b608-4be8-bfd3-65642710f9bd-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 977.813439] env[68906]: DEBUG oslo_concurrency.lockutils [None req-6333a4bb-d4a4-44d7-b760-cbd7cf2f16fd tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Lock "4edb8b9f-b608-4be8-bfd3-65642710f9bd-events" acquired by 
"nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 977.813670] env[68906]: DEBUG oslo_concurrency.lockutils [None req-6333a4bb-d4a4-44d7-b760-cbd7cf2f16fd tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Lock "4edb8b9f-b608-4be8-bfd3-65642710f9bd-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 977.817675] env[68906]: INFO nova.compute.manager [None req-6333a4bb-d4a4-44d7-b760-cbd7cf2f16fd tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] [instance: 4edb8b9f-b608-4be8-bfd3-65642710f9bd] Terminating instance [ 977.818645] env[68906]: DEBUG nova.compute.manager [None req-6333a4bb-d4a4-44d7-b760-cbd7cf2f16fd tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] [instance: 4edb8b9f-b608-4be8-bfd3-65642710f9bd] Start destroying the instance on the hypervisor. {{(pid=68906) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 977.818955] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-6333a4bb-d4a4-44d7-b760-cbd7cf2f16fd tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] [instance: 4edb8b9f-b608-4be8-bfd3-65642710f9bd] Destroying instance {{(pid=68906) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 977.819615] env[68906]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-c1919c83-f415-468a-b28a-f5f7432f1a59 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 977.827667] env[68906]: DEBUG nova.compute.manager [None req-ae8fe18e-f486-42a3-9628-3d30cfec0923 tempest-ListImageFiltersTestJSON-730025420 tempest-ListImageFiltersTestJSON-730025420-project-member] [instance: 0874bf05-e156-404e-a067-869e370fd14b] Starting instance... {{(pid=68906) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 977.835483] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8080d6b0-95c0-4d02-a0fd-6128989a3920 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 977.864276] env[68906]: WARNING nova.virt.vmwareapi.vmops [None req-6333a4bb-d4a4-44d7-b760-cbd7cf2f16fd tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] [instance: 4edb8b9f-b608-4be8-bfd3-65642710f9bd] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 4edb8b9f-b608-4be8-bfd3-65642710f9bd could not be found. 
[ 977.864545] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-6333a4bb-d4a4-44d7-b760-cbd7cf2f16fd tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] [instance: 4edb8b9f-b608-4be8-bfd3-65642710f9bd] Instance destroyed {{(pid=68906) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 977.864764] env[68906]: INFO nova.compute.manager [None req-6333a4bb-d4a4-44d7-b760-cbd7cf2f16fd tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] [instance: 4edb8b9f-b608-4be8-bfd3-65642710f9bd] Took 0.05 seconds to destroy the instance on the hypervisor. [ 977.865056] env[68906]: DEBUG oslo.service.loopingcall [None req-6333a4bb-d4a4-44d7-b760-cbd7cf2f16fd tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=68906) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 977.865313] env[68906]: DEBUG nova.compute.manager [-] [instance: 4edb8b9f-b608-4be8-bfd3-65642710f9bd] Deallocating network for instance {{(pid=68906) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 977.865446] env[68906]: DEBUG nova.network.neutron [-] [instance: 4edb8b9f-b608-4be8-bfd3-65642710f9bd] deallocate_for_instance() {{(pid=68906) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 977.873648] env[68906]: DEBUG nova.compute.manager [None req-ae8fe18e-f486-42a3-9628-3d30cfec0923 tempest-ListImageFiltersTestJSON-730025420 tempest-ListImageFiltersTestJSON-730025420-project-member] [instance: 0874bf05-e156-404e-a067-869e370fd14b] Instance disappeared before build. {{(pid=68906) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 977.891018] env[68906]: DEBUG nova.network.neutron [-] [instance: 4edb8b9f-b608-4be8-bfd3-65642710f9bd] Updating instance_info_cache with network_info: [] {{(pid=68906) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 977.901190] env[68906]: INFO nova.compute.manager [-] [instance: 4edb8b9f-b608-4be8-bfd3-65642710f9bd] Took 0.04 seconds to deallocate network for instance. [ 977.903710] env[68906]: DEBUG oslo_concurrency.lockutils [None req-ae8fe18e-f486-42a3-9628-3d30cfec0923 tempest-ListImageFiltersTestJSON-730025420 tempest-ListImageFiltersTestJSON-730025420-project-member] Lock "0874bf05-e156-404e-a067-869e370fd14b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 235.198s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 977.915181] env[68906]: DEBUG nova.compute.manager [None req-e2cf9422-57fc-4510-acba-da32daf36063 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] [instance: e7286888-d79d-4632-9c06-69c1ef47fa50] Starting instance... {{(pid=68906) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}}
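
The Acquiring / acquired / released trios above come from oslo.concurrency's lockutils wrapper, which logs how long each caller waited for and then held a named lock. A minimal sketch of the same pattern (lock names copied from the log; the function bodies are stand-ins):

    from oslo_concurrency import lockutils

    @lockutils.synchronized('compute_resources')
    def update_available_resource():
        # Runs with the in-process "compute_resources" lock held; lockutils
        # logs the waited/held durations at DEBUG, as seen above.
        pass

    def clear_events_for_instance(instance_uuid):
        # The per-instance "<uuid>-events" lock, used as a context manager.
        with lockutils.lock('%s-events' % instance_uuid):
            pass

    update_available_resource()
    clear_events_for_instance('4edb8b9f-b608-4be8-bfd3-65642710f9bd')
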
[ 977.964782] env[68906]: DEBUG oslo_concurrency.lockutils [None req-e2cf9422-57fc-4510-acba-da32daf36063 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 977.965038] env[68906]: DEBUG oslo_concurrency.lockutils [None req-e2cf9422-57fc-4510-acba-da32daf36063 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 977.966505] env[68906]: INFO nova.compute.claims [None req-e2cf9422-57fc-4510-acba-da32daf36063 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] [instance: e7286888-d79d-4632-9c06-69c1ef47fa50] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 977.997465] env[68906]: DEBUG oslo_concurrency.lockutils [None req-6333a4bb-d4a4-44d7-b760-cbd7cf2f16fd tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Lock "4edb8b9f-b608-4be8-bfd3-65642710f9bd" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.185s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 978.310895] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e3d8a414-498a-443b-b415-4c9e608e9013 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 978.321220] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8f879239-3a90-4cb1-85f1-ec797868b92e {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 978.357222] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e9274901-3f4d-47a4-b136-42b2f1eebe1c {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 978.364535] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-740fb333-78fe-4616-a2b2-7501c9a5ae16 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 978.377691] env[68906]: DEBUG nova.compute.provider_tree [None req-e2cf9422-57fc-4510-acba-da32daf36063 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Inventory has not changed in ProviderTree for provider: 1119f6db-bfd7-4ef3-bdff-5c6974dc249b {{(pid=68906) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 978.390873] env[68906]: DEBUG nova.scheduler.client.report [None req-e2cf9422-57fc-4510-acba-da32daf36063 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Inventory has not changed for provider 1119f6db-bfd7-4ef3-bdff-5c6974dc249b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit':
1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 93, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68906) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 978.404583] env[68906]: DEBUG oslo_concurrency.lockutils [None req-e2cf9422-57fc-4510-acba-da32daf36063 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.439s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 978.405227] env[68906]: DEBUG nova.compute.manager [None req-e2cf9422-57fc-4510-acba-da32daf36063 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] [instance: e7286888-d79d-4632-9c06-69c1ef47fa50] Start building networks asynchronously for instance. {{(pid=68906) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 978.453149] env[68906]: DEBUG nova.compute.utils [None req-e2cf9422-57fc-4510-acba-da32daf36063 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Using /dev/sd instead of None {{(pid=68906) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 978.455453] env[68906]: DEBUG nova.compute.manager [None req-e2cf9422-57fc-4510-acba-da32daf36063 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] [instance: e7286888-d79d-4632-9c06-69c1ef47fa50] Allocating IP information in the background. {{(pid=68906) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 978.455736] env[68906]: DEBUG nova.network.neutron [None req-e2cf9422-57fc-4510-acba-da32daf36063 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] [instance: e7286888-d79d-4632-9c06-69c1ef47fa50] allocate_for_instance() {{(pid=68906) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 978.470494] env[68906]: DEBUG nova.compute.manager [None req-e2cf9422-57fc-4510-acba-da32daf36063 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] [instance: e7286888-d79d-4632-9c06-69c1ef47fa50] Start building block device mappings for instance. {{(pid=68906) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 978.543400] env[68906]: DEBUG nova.compute.manager [None req-e2cf9422-57fc-4510-acba-da32daf36063 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] [instance: e7286888-d79d-4632-9c06-69c1ef47fa50] Start spawning the instance on the hypervisor. 
{{(pid=68906) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 978.561192] env[68906]: DEBUG nova.policy [None req-e2cf9422-57fc-4510-acba-da32daf36063 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'b46c06fcd3404f45abc083563415467b', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'da1df204e7064662bf5c15a1598c0d4e', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68906) authorize /opt/stack/nova/nova/policy.py:203}} [ 978.572798] env[68906]: DEBUG nova.virt.hardware [None req-e2cf9422-57fc-4510-acba-da32daf36063 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-17T13:00:39Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-17T13:00:23Z,direct_url=,disk_format='vmdk',id=b1400c31-d33b-4e13-944f-4c645e62493e,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='1ae7bf3a375d41c6af5e7536af51ffd1',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-17T13:00:24Z,virtual_size=,visibility=), allow threads: False {{(pid=68906) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 978.573231] env[68906]: DEBUG nova.virt.hardware [None req-e2cf9422-57fc-4510-acba-da32daf36063 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Flavor limits 0:0:0 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 978.573481] env[68906]: DEBUG nova.virt.hardware [None req-e2cf9422-57fc-4510-acba-da32daf36063 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Image limits 0:0:0 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 978.573724] env[68906]: DEBUG nova.virt.hardware [None req-e2cf9422-57fc-4510-acba-da32daf36063 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Flavor pref 0:0:0 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 978.573878] env[68906]: DEBUG nova.virt.hardware [None req-e2cf9422-57fc-4510-acba-da32daf36063 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Image pref 0:0:0 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 978.574038] env[68906]: DEBUG nova.virt.hardware [None req-e2cf9422-57fc-4510-acba-da32daf36063 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 978.574254] env[68906]: DEBUG 
nova.virt.hardware [None req-e2cf9422-57fc-4510-acba-da32daf36063 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68906) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 978.574418] env[68906]: DEBUG nova.virt.hardware [None req-e2cf9422-57fc-4510-acba-da32daf36063 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68906) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 978.574586] env[68906]: DEBUG nova.virt.hardware [None req-e2cf9422-57fc-4510-acba-da32daf36063 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Got 1 possible topologies {{(pid=68906) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 978.574748] env[68906]: DEBUG nova.virt.hardware [None req-e2cf9422-57fc-4510-acba-da32daf36063 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68906) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 978.574920] env[68906]: DEBUG nova.virt.hardware [None req-e2cf9422-57fc-4510-acba-da32daf36063 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68906) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 978.575808] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c3fac8c7-c708-48a2-b8e9-d024592a7750 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 978.583744] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9960f410-5a4b-4d09-9949-72c84ec667fe {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 979.441653] env[68906]: DEBUG nova.network.neutron [None req-e2cf9422-57fc-4510-acba-da32daf36063 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] [instance: e7286888-d79d-4632-9c06-69c1ef47fa50] Successfully created port: b064c1c5-c272-4c9d-b5f9-fff92c024be6 {{(pid=68906) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 980.559222] env[68906]: DEBUG nova.network.neutron [None req-e2cf9422-57fc-4510-acba-da32daf36063 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] [instance: e7286888-d79d-4632-9c06-69c1ef47fa50] Successfully updated port: b064c1c5-c272-4c9d-b5f9-fff92c024be6 {{(pid=68906) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 980.570324] env[68906]: DEBUG oslo_concurrency.lockutils [None req-e2cf9422-57fc-4510-acba-da32daf36063 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Acquiring lock "refresh_cache-e7286888-d79d-4632-9c06-69c1ef47fa50" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 980.570484] env[68906]: DEBUG oslo_concurrency.lockutils [None req-e2cf9422-57fc-4510-acba-da32daf36063 
tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Acquired lock "refresh_cache-e7286888-d79d-4632-9c06-69c1ef47fa50" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 980.570636] env[68906]: DEBUG nova.network.neutron [None req-e2cf9422-57fc-4510-acba-da32daf36063 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] [instance: e7286888-d79d-4632-9c06-69c1ef47fa50] Building network info cache for instance {{(pid=68906) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 980.688579] env[68906]: DEBUG nova.network.neutron [None req-e2cf9422-57fc-4510-acba-da32daf36063 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] [instance: e7286888-d79d-4632-9c06-69c1ef47fa50] Instance cache missing network info. {{(pid=68906) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 980.795921] env[68906]: DEBUG nova.compute.manager [req-5d8bc285-a1a3-44b3-89c7-280ecbeadb9d req-a8b8b4d0-4ae8-44ef-96d4-b66164ee345c service nova] [instance: e7286888-d79d-4632-9c06-69c1ef47fa50] Received event network-vif-plugged-b064c1c5-c272-4c9d-b5f9-fff92c024be6 {{(pid=68906) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 980.796174] env[68906]: DEBUG oslo_concurrency.lockutils [req-5d8bc285-a1a3-44b3-89c7-280ecbeadb9d req-a8b8b4d0-4ae8-44ef-96d4-b66164ee345c service nova] Acquiring lock "e7286888-d79d-4632-9c06-69c1ef47fa50-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 980.796389] env[68906]: DEBUG oslo_concurrency.lockutils [req-5d8bc285-a1a3-44b3-89c7-280ecbeadb9d req-a8b8b4d0-4ae8-44ef-96d4-b66164ee345c service nova] Lock "e7286888-d79d-4632-9c06-69c1ef47fa50-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 980.796604] env[68906]: DEBUG oslo_concurrency.lockutils [req-5d8bc285-a1a3-44b3-89c7-280ecbeadb9d req-a8b8b4d0-4ae8-44ef-96d4-b66164ee345c service nova] Lock "e7286888-d79d-4632-9c06-69c1ef47fa50-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 980.796776] env[68906]: DEBUG nova.compute.manager [req-5d8bc285-a1a3-44b3-89c7-280ecbeadb9d req-a8b8b4d0-4ae8-44ef-96d4-b66164ee345c service nova] [instance: e7286888-d79d-4632-9c06-69c1ef47fa50] No waiting events found dispatching network-vif-plugged-b064c1c5-c272-4c9d-b5f9-fff92c024be6 {{(pid=68906) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 980.797483] env[68906]: WARNING nova.compute.manager [req-5d8bc285-a1a3-44b3-89c7-280ecbeadb9d req-a8b8b4d0-4ae8-44ef-96d4-b66164ee345c service nova] [instance: e7286888-d79d-4632-9c06-69c1ef47fa50] Received unexpected event network-vif-plugged-b064c1c5-c272-4c9d-b5f9-fff92c024be6 for instance with vm_state building and task_state spawning. 
[ 981.014666] env[68906]: DEBUG oslo_concurrency.lockutils [None req-4d72aadc-5a5f-4626-b220-381c1afbdf3f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Acquiring lock "db011373-7285-4882-8bce-d39cfa22fe80" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 981.014912] env[68906]: DEBUG oslo_concurrency.lockutils [None req-4d72aadc-5a5f-4626-b220-381c1afbdf3f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Lock "db011373-7285-4882-8bce-d39cfa22fe80" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 981.175677] env[68906]: DEBUG nova.network.neutron [None req-e2cf9422-57fc-4510-acba-da32daf36063 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] [instance: e7286888-d79d-4632-9c06-69c1ef47fa50] Updating instance_info_cache with network_info: [{"id": "b064c1c5-c272-4c9d-b5f9-fff92c024be6", "address": "fa:16:3e:ef:c0:54", "network": {"id": "cbbbf860-58e5-4164-8de8-b1492ffc7605", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-1183077938-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "da1df204e7064662bf5c15a1598c0d4e", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "6eb7e3e9-5cc2-40f1-a6eb-f70f06531667", "external-id": "nsx-vlan-transportzone-938", "segmentation_id": 938, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapb064c1c5-c2", "ovs_interfaceid": "b064c1c5-c272-4c9d-b5f9-fff92c024be6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68906) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 981.188364] env[68906]: DEBUG oslo_concurrency.lockutils [None req-e2cf9422-57fc-4510-acba-da32daf36063 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Releasing lock "refresh_cache-e7286888-d79d-4632-9c06-69c1ef47fa50" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 981.188694] env[68906]: DEBUG nova.compute.manager [None req-e2cf9422-57fc-4510-acba-da32daf36063 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] [instance: e7286888-d79d-4632-9c06-69c1ef47fa50] Instance network_info: |[{"id": "b064c1c5-c272-4c9d-b5f9-fff92c024be6", "address": "fa:16:3e:ef:c0:54", "network": {"id": "cbbbf860-58e5-4164-8de8-b1492ffc7605", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-1183077938-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": 
[{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "da1df204e7064662bf5c15a1598c0d4e", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "6eb7e3e9-5cc2-40f1-a6eb-f70f06531667", "external-id": "nsx-vlan-transportzone-938", "segmentation_id": 938, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapb064c1c5-c2", "ovs_interfaceid": "b064c1c5-c272-4c9d-b5f9-fff92c024be6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=68906) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 981.189128] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-e2cf9422-57fc-4510-acba-da32daf36063 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] [instance: e7286888-d79d-4632-9c06-69c1ef47fa50] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:ef:c0:54', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '6eb7e3e9-5cc2-40f1-a6eb-f70f06531667', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'b064c1c5-c272-4c9d-b5f9-fff92c024be6', 'vif_model': 'vmxnet3'}] {{(pid=68906) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 981.197018] env[68906]: DEBUG nova.virt.vmwareapi.vm_util [None req-e2cf9422-57fc-4510-acba-da32daf36063 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Creating folder: Project (da1df204e7064662bf5c15a1598c0d4e). Parent ref: group-v694750. {{(pid=68906) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 981.197944] env[68906]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-e5967413-c3ef-4525-8758-a99e5836ae1b {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 981.209966] env[68906]: INFO nova.virt.vmwareapi.vm_util [None req-e2cf9422-57fc-4510-acba-da32daf36063 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Created folder: Project (da1df204e7064662bf5c15a1598c0d4e) in parent group-v694750. [ 981.210182] env[68906]: DEBUG nova.virt.vmwareapi.vm_util [None req-e2cf9422-57fc-4510-acba-da32daf36063 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Creating folder: Instances. Parent ref: group-v694805. {{(pid=68906) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 981.210423] env[68906]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-d44cdc9a-8f8e-457a-b972-3b5f3dafd14b {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 981.221031] env[68906]: INFO nova.virt.vmwareapi.vm_util [None req-e2cf9422-57fc-4510-acba-da32daf36063 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Created folder: Instances in parent group-v694805. 
[ 981.221031] env[68906]: DEBUG oslo.service.loopingcall [None req-e2cf9422-57fc-4510-acba-da32daf36063 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=68906) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 981.221031] env[68906]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: e7286888-d79d-4632-9c06-69c1ef47fa50] Creating VM on the ESX host {{(pid=68906) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 981.221389] env[68906]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-db2f1cc4-d644-4882-ab8a-4a7ca5f09cce {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 981.241983] env[68906]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 981.241983] env[68906]: value = "task-3475342" [ 981.241983] env[68906]: _type = "Task" [ 981.241983] env[68906]: } to complete. {{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 981.249548] env[68906]: DEBUG oslo_vmware.api [-] Task: {'id': task-3475342, 'name': CreateVM_Task} progress is 0%. {{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 981.751555] env[68906]: DEBUG oslo_vmware.api [-] Task: {'id': task-3475342, 'name': CreateVM_Task, 'duration_secs': 0.299167} completed successfully. {{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 981.751810] env[68906]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: e7286888-d79d-4632-9c06-69c1ef47fa50] Created VM on the ESX host {{(pid=68906) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 981.752450] env[68906]: DEBUG oslo_concurrency.lockutils [None req-e2cf9422-57fc-4510-acba-da32daf36063 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 981.752627] env[68906]: DEBUG oslo_concurrency.lockutils [None req-e2cf9422-57fc-4510-acba-da32daf36063 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Acquired lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 981.752952] env[68906]: DEBUG oslo_concurrency.lockutils [None req-e2cf9422-57fc-4510-acba-da32daf36063 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 981.753213] env[68906]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-9e4748c3-7553-4126-982a-80f0d80c7f4e {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 981.758309] env[68906]: DEBUG oslo_vmware.api [None req-e2cf9422-57fc-4510-acba-da32daf36063 tempest-AttachVolumeNegativeTest-681905198 
tempest-AttachVolumeNegativeTest-681905198-project-member] Waiting for the task: (returnval){ [ 981.758309] env[68906]: value = "session[52a3cb0d-1212-490c-2669-91043b4da4d8]52685a4c-5b2f-fe29-d03d-9826caca7f5f" [ 981.758309] env[68906]: _type = "Task" [ 981.758309] env[68906]: } to complete. {{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 981.767042] env[68906]: DEBUG oslo_vmware.api [None req-e2cf9422-57fc-4510-acba-da32daf36063 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Task: {'id': session[52a3cb0d-1212-490c-2669-91043b4da4d8]52685a4c-5b2f-fe29-d03d-9826caca7f5f, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 982.268411] env[68906]: DEBUG oslo_concurrency.lockutils [None req-e2cf9422-57fc-4510-acba-da32daf36063 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Releasing lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 982.268703] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-e2cf9422-57fc-4510-acba-da32daf36063 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] [instance: e7286888-d79d-4632-9c06-69c1ef47fa50] Processing image b1400c31-d33b-4e13-944f-4c645e62493e {{(pid=68906) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 982.268916] env[68906]: DEBUG oslo_concurrency.lockutils [None req-e2cf9422-57fc-4510-acba-da32daf36063 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e/b1400c31-d33b-4e13-944f-4c645e62493e.vmdk" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 982.885943] env[68906]: DEBUG nova.compute.manager [req-4a13ba03-5623-42c0-b56f-5fc3dffa1099 req-3deeadcb-1e9e-484b-b5f2-6950cbd7c8ae service nova] [instance: e7286888-d79d-4632-9c06-69c1ef47fa50] Received event network-changed-b064c1c5-c272-4c9d-b5f9-fff92c024be6 {{(pid=68906) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 982.886207] env[68906]: DEBUG nova.compute.manager [req-4a13ba03-5623-42c0-b56f-5fc3dffa1099 req-3deeadcb-1e9e-484b-b5f2-6950cbd7c8ae service nova] [instance: e7286888-d79d-4632-9c06-69c1ef47fa50] Refreshing instance network info cache due to event network-changed-b064c1c5-c272-4c9d-b5f9-fff92c024be6. 
{{(pid=68906) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 982.886385] env[68906]: DEBUG oslo_concurrency.lockutils [req-4a13ba03-5623-42c0-b56f-5fc3dffa1099 req-3deeadcb-1e9e-484b-b5f2-6950cbd7c8ae service nova] Acquiring lock "refresh_cache-e7286888-d79d-4632-9c06-69c1ef47fa50" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 982.886527] env[68906]: DEBUG oslo_concurrency.lockutils [req-4a13ba03-5623-42c0-b56f-5fc3dffa1099 req-3deeadcb-1e9e-484b-b5f2-6950cbd7c8ae service nova] Acquired lock "refresh_cache-e7286888-d79d-4632-9c06-69c1ef47fa50" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 982.886685] env[68906]: DEBUG nova.network.neutron [req-4a13ba03-5623-42c0-b56f-5fc3dffa1099 req-3deeadcb-1e9e-484b-b5f2-6950cbd7c8ae service nova] [instance: e7286888-d79d-4632-9c06-69c1ef47fa50] Refreshing network info cache for port b064c1c5-c272-4c9d-b5f9-fff92c024be6 {{(pid=68906) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 983.356750] env[68906]: DEBUG nova.network.neutron [req-4a13ba03-5623-42c0-b56f-5fc3dffa1099 req-3deeadcb-1e9e-484b-b5f2-6950cbd7c8ae service nova] [instance: e7286888-d79d-4632-9c06-69c1ef47fa50] Updated VIF entry in instance network info cache for port b064c1c5-c272-4c9d-b5f9-fff92c024be6. {{(pid=68906) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 983.357118] env[68906]: DEBUG nova.network.neutron [req-4a13ba03-5623-42c0-b56f-5fc3dffa1099 req-3deeadcb-1e9e-484b-b5f2-6950cbd7c8ae service nova] [instance: e7286888-d79d-4632-9c06-69c1ef47fa50] Updating instance_info_cache with network_info: [{"id": "b064c1c5-c272-4c9d-b5f9-fff92c024be6", "address": "fa:16:3e:ef:c0:54", "network": {"id": "cbbbf860-58e5-4164-8de8-b1492ffc7605", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-1183077938-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "da1df204e7064662bf5c15a1598c0d4e", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "6eb7e3e9-5cc2-40f1-a6eb-f70f06531667", "external-id": "nsx-vlan-transportzone-938", "segmentation_id": 938, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapb064c1c5-c2", "ovs_interfaceid": "b064c1c5-c272-4c9d-b5f9-fff92c024be6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68906) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 983.366237] env[68906]: DEBUG oslo_concurrency.lockutils [req-4a13ba03-5623-42c0-b56f-5fc3dffa1099 req-3deeadcb-1e9e-484b-b5f2-6950cbd7c8ae service nova] Releasing lock "refresh_cache-e7286888-d79d-4632-9c06-69c1ef47fa50" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 986.362404] env[68906]: DEBUG oslo_concurrency.lockutils [None req-ca7f83ad-d788-4fa0-93cd-bbf7b915f100 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] 
Acquiring lock "e7286888-d79d-4632-9c06-69c1ef47fa50" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 995.844595] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 996.140754] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 998.138560] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 998.140147] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 999.140565] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 999.140816] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Starting heal instance info cache {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 999.140900] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Rebuilding the list of instances to heal {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 999.163788] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: d6ca51b9-b284-405c-878e-fdbc326b73e1] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 999.163954] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: f42056e5-52cb-4d69-8022-ca643c49194e] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 999.164106] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: ce63789a-1f0f-40ca-8368-ac3f84bb58cd] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 999.164234] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: 9a2d2803-34b1-40f7-9349-e5734a217e18] Skipping network cache update for instance because it is Building. 
{{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 999.164361] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: 13eebe4e-5984-46c3-bb73-cd783ad45df6] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 999.164483] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 999.164606] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 999.164725] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: 91d405f9-5ad1-4e8e-9a32-1a5ef81ecc63] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 999.164845] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: acc11633-a489-4d8f-ad76-f17049a91545] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 999.164965] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: e7286888-d79d-4632-9c06-69c1ef47fa50] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 999.165157] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Didn't find any instances for network info cache update. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 999.165668] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 999.165813] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=68906) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 1000.140597] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1001.140639] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager.update_available_resource {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1001.179217] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1001.179217] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1001.179217] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1001.179217] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=68906) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1001.179217] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-32abc005-fc14-4c01-9138-df11579a9517 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1001.188096] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-97836eec-8dcb-458d-b14f-871ee00c65a1 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1001.203114] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d416b1a2-134f-4ed7-8624-0a2d47f45d40 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1001.208773] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-59bb758b-710a-4111-8b3c-7baaa4073960 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1001.239631] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180967MB free_disk=93GB free_vcpus=48 pci_devices=None {{(pid=68906) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1001.239631] env[68906]: DEBUG 
oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1001.239631] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1001.318844] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance d6ca51b9-b284-405c-878e-fdbc326b73e1 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1001.319018] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance f42056e5-52cb-4d69-8022-ca643c49194e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1001.319183] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance ce63789a-1f0f-40ca-8368-ac3f84bb58cd actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1001.319319] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 13eebe4e-5984-46c3-bb73-cd783ad45df6 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1001.319442] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 9a2d2803-34b1-40f7-9349-e5734a217e18 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1001.319598] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance a7e0a28f-42a5-442e-b962-07771d2e6a27 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1001.319741] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance eb81e9b1-b573-4d7c-9ede-f8b32a43a201 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1001.319857] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 91d405f9-5ad1-4e8e-9a32-1a5ef81ecc63 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1001.319973] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance acc11633-a489-4d8f-ad76-f17049a91545 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1001.320102] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance e7286888-d79d-4632-9c06-69c1ef47fa50 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1001.331449] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 641cca5b-d749-4331-a5e0-8acb6d47cba2 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1001.343279] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 6d7b4648-a12f-4c3c-8465-b8fb37eb0d3c has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1001.354807] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance ad955cdc-85f1-4096-b2ec-7635d289ee57 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1001.365738] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 9bd17d9a-2f34-4e99-91b1-a7c3fbc1f29a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1001.375824] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance a37ef3ce-1c29-48fe-b9c6-023da5b3db71 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1001.387661] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance ee17e223-bec7-4541-8cb2-25e4a6c32b34 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1001.397808] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 3ba4a60f-6c41-4e1e-8928-f1b95b885028 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1001.408380] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 56f936b4-680d-40db-84ab-8eb319f6ee83 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1001.418145] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance faec727e-bd92-4201-aaca-5863208be265 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1001.429580] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 18a5c392-b836-4d2a-bb77-d4af0b9fdb81 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1001.439860] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 2e6de0b1-335b-49bd-aa15-3fd4cc4b4e9e has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1001.450621] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance d71bae07-54c1-427b-bfe1-2467369cd80c has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1001.460312] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 682f0e61-471f-47fb-98de-02449b17d241 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1001.471591] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 4d36bb91-0cde-44cb-8706-d17740a9cf50 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1001.481925] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance db011373-7285-4882-8bce-d39cfa22fe80 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1001.481925] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=68906) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1001.481925] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=68906) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1001.765816] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-50d6e651-ba74-42f4-b5ab-64a14ea7ac16 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1001.773456] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-80cc5bc4-74c5-4152-b405-025fce222fbb {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1001.804231] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-87a89317-a186-4d11-b66c-dbdae0255466 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1001.811616] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0bdf626d-2e4f-4650-9eb5-df2c8f7ecdbf {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1001.827168] env[68906]: DEBUG nova.compute.provider_tree [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Inventory has not changed in ProviderTree for provider: 1119f6db-bfd7-4ef3-bdff-5c6974dc249b {{(pid=68906) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1001.838032] env[68906]: DEBUG nova.scheduler.client.report [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Inventory has not changed for provider 1119f6db-bfd7-4ef3-bdff-5c6974dc249b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 93, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68906) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1001.849652] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=68906) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1001.849652] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.610s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1002.850842] env[68906]: DEBUG oslo_service.periodic_task [None 
req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1022.612651] env[68906]: WARNING oslo_vmware.rw_handles [None req-579f131a-2dbc-470d-9058-ec3a343d421d tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1022.612651] env[68906]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1022.612651] env[68906]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1022.612651] env[68906]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1022.612651] env[68906]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1022.612651] env[68906]: ERROR oslo_vmware.rw_handles response.begin() [ 1022.612651] env[68906]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1022.612651] env[68906]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1022.612651] env[68906]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1022.612651] env[68906]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1022.612651] env[68906]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1022.612651] env[68906]: ERROR oslo_vmware.rw_handles [ 1022.613380] env[68906]: DEBUG nova.virt.vmwareapi.images [None req-579f131a-2dbc-470d-9058-ec3a343d421d tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: d6ca51b9-b284-405c-878e-fdbc326b73e1] Downloaded image file data b1400c31-d33b-4e13-944f-4c645e62493e to vmware_temp/e3c62a77-aea4-4be1-9282-8082437ef873/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk on the data store datastore2 {{(pid=68906) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1022.614897] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-579f131a-2dbc-470d-9058-ec3a343d421d tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: d6ca51b9-b284-405c-878e-fdbc326b73e1] Caching image {{(pid=68906) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1022.615192] env[68906]: DEBUG nova.virt.vmwareapi.vm_util [None req-579f131a-2dbc-470d-9058-ec3a343d421d tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Copying Virtual Disk [datastore2] vmware_temp/e3c62a77-aea4-4be1-9282-8082437ef873/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk to [datastore2] vmware_temp/e3c62a77-aea4-4be1-9282-8082437ef873/b1400c31-d33b-4e13-944f-4c645e62493e/b1400c31-d33b-4e13-944f-4c645e62493e.vmdk {{(pid=68906) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1022.615445] env[68906]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-0c363855-2d19-45e4-acf3-07f4684ba3a5 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1022.623015] env[68906]: DEBUG oslo_vmware.api [None req-579f131a-2dbc-470d-9058-ec3a343d421d 
tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Waiting for the task: (returnval){ [ 1022.623015] env[68906]: value = "task-3475343" [ 1022.623015] env[68906]: _type = "Task" [ 1022.623015] env[68906]: } to complete. {{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1022.631535] env[68906]: DEBUG oslo_vmware.api [None req-579f131a-2dbc-470d-9058-ec3a343d421d tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Task: {'id': task-3475343, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1023.134395] env[68906]: DEBUG oslo_vmware.exceptions [None req-579f131a-2dbc-470d-9058-ec3a343d421d tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Fault InvalidArgument not matched. {{(pid=68906) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1023.134629] env[68906]: DEBUG oslo_concurrency.lockutils [None req-579f131a-2dbc-470d-9058-ec3a343d421d tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Releasing lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e/b1400c31-d33b-4e13-944f-4c645e62493e.vmdk" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1023.135211] env[68906]: ERROR nova.compute.manager [None req-579f131a-2dbc-470d-9058-ec3a343d421d tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: d6ca51b9-b284-405c-878e-fdbc326b73e1] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1023.135211] env[68906]: Faults: ['InvalidArgument'] [ 1023.135211] env[68906]: ERROR nova.compute.manager [instance: d6ca51b9-b284-405c-878e-fdbc326b73e1] Traceback (most recent call last): [ 1023.135211] env[68906]: ERROR nova.compute.manager [instance: d6ca51b9-b284-405c-878e-fdbc326b73e1] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1023.135211] env[68906]: ERROR nova.compute.manager [instance: d6ca51b9-b284-405c-878e-fdbc326b73e1] yield resources [ 1023.135211] env[68906]: ERROR nova.compute.manager [instance: d6ca51b9-b284-405c-878e-fdbc326b73e1] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1023.135211] env[68906]: ERROR nova.compute.manager [instance: d6ca51b9-b284-405c-878e-fdbc326b73e1] self.driver.spawn(context, instance, image_meta, [ 1023.135211] env[68906]: ERROR nova.compute.manager [instance: d6ca51b9-b284-405c-878e-fdbc326b73e1] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1023.135211] env[68906]: ERROR nova.compute.manager [instance: d6ca51b9-b284-405c-878e-fdbc326b73e1] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1023.135211] env[68906]: ERROR nova.compute.manager [instance: d6ca51b9-b284-405c-878e-fdbc326b73e1] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1023.135211] env[68906]: ERROR nova.compute.manager [instance: d6ca51b9-b284-405c-878e-fdbc326b73e1] self._fetch_image_if_missing(context, vi) [ 1023.135211] env[68906]: ERROR nova.compute.manager [instance: d6ca51b9-b284-405c-878e-fdbc326b73e1] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1023.135503] env[68906]: 
ERROR nova.compute.manager [instance: d6ca51b9-b284-405c-878e-fdbc326b73e1] image_cache(vi, tmp_image_ds_loc) [ 1023.135503] env[68906]: ERROR nova.compute.manager [instance: d6ca51b9-b284-405c-878e-fdbc326b73e1] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1023.135503] env[68906]: ERROR nova.compute.manager [instance: d6ca51b9-b284-405c-878e-fdbc326b73e1] vm_util.copy_virtual_disk( [ 1023.135503] env[68906]: ERROR nova.compute.manager [instance: d6ca51b9-b284-405c-878e-fdbc326b73e1] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1023.135503] env[68906]: ERROR nova.compute.manager [instance: d6ca51b9-b284-405c-878e-fdbc326b73e1] session._wait_for_task(vmdk_copy_task) [ 1023.135503] env[68906]: ERROR nova.compute.manager [instance: d6ca51b9-b284-405c-878e-fdbc326b73e1] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1023.135503] env[68906]: ERROR nova.compute.manager [instance: d6ca51b9-b284-405c-878e-fdbc326b73e1] return self.wait_for_task(task_ref) [ 1023.135503] env[68906]: ERROR nova.compute.manager [instance: d6ca51b9-b284-405c-878e-fdbc326b73e1] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1023.135503] env[68906]: ERROR nova.compute.manager [instance: d6ca51b9-b284-405c-878e-fdbc326b73e1] return evt.wait() [ 1023.135503] env[68906]: ERROR nova.compute.manager [instance: d6ca51b9-b284-405c-878e-fdbc326b73e1] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1023.135503] env[68906]: ERROR nova.compute.manager [instance: d6ca51b9-b284-405c-878e-fdbc326b73e1] result = hub.switch() [ 1023.135503] env[68906]: ERROR nova.compute.manager [instance: d6ca51b9-b284-405c-878e-fdbc326b73e1] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1023.135503] env[68906]: ERROR nova.compute.manager [instance: d6ca51b9-b284-405c-878e-fdbc326b73e1] return self.greenlet.switch() [ 1023.136073] env[68906]: ERROR nova.compute.manager [instance: d6ca51b9-b284-405c-878e-fdbc326b73e1] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1023.136073] env[68906]: ERROR nova.compute.manager [instance: d6ca51b9-b284-405c-878e-fdbc326b73e1] self.f(*self.args, **self.kw) [ 1023.136073] env[68906]: ERROR nova.compute.manager [instance: d6ca51b9-b284-405c-878e-fdbc326b73e1] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1023.136073] env[68906]: ERROR nova.compute.manager [instance: d6ca51b9-b284-405c-878e-fdbc326b73e1] raise exceptions.translate_fault(task_info.error) [ 1023.136073] env[68906]: ERROR nova.compute.manager [instance: d6ca51b9-b284-405c-878e-fdbc326b73e1] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1023.136073] env[68906]: ERROR nova.compute.manager [instance: d6ca51b9-b284-405c-878e-fdbc326b73e1] Faults: ['InvalidArgument'] [ 1023.136073] env[68906]: ERROR nova.compute.manager [instance: d6ca51b9-b284-405c-878e-fdbc326b73e1] [ 1023.136073] env[68906]: INFO nova.compute.manager [None req-579f131a-2dbc-470d-9058-ec3a343d421d tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: d6ca51b9-b284-405c-878e-fdbc326b73e1] Terminating instance [ 1023.137163] env[68906]: DEBUG oslo_concurrency.lockutils [None 
req-25b03b1c-8272-403d-9ff2-a1dba7656508 tempest-VolumesAdminNegativeTest-1259132687 tempest-VolumesAdminNegativeTest-1259132687-project-member] Acquired lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e/b1400c31-d33b-4e13-944f-4c645e62493e.vmdk" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1023.137370] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-25b03b1c-8272-403d-9ff2-a1dba7656508 tempest-VolumesAdminNegativeTest-1259132687 tempest-VolumesAdminNegativeTest-1259132687-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=68906) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1023.137608] env[68906]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-de9bc30b-c89f-40bc-ac76-8a73342167d0 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1023.139842] env[68906]: DEBUG nova.compute.manager [None req-579f131a-2dbc-470d-9058-ec3a343d421d tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: d6ca51b9-b284-405c-878e-fdbc326b73e1] Start destroying the instance on the hypervisor. {{(pid=68906) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1023.140045] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-579f131a-2dbc-470d-9058-ec3a343d421d tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: d6ca51b9-b284-405c-878e-fdbc326b73e1] Destroying instance {{(pid=68906) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1023.140749] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a6fc27b4-a730-4f9f-b4e7-62470142c4d9 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1023.147648] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-579f131a-2dbc-470d-9058-ec3a343d421d tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: d6ca51b9-b284-405c-878e-fdbc326b73e1] Unregistering the VM {{(pid=68906) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1023.147846] env[68906]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-2643b09d-eef3-441a-8537-8b68690e4c60 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1023.149915] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-25b03b1c-8272-403d-9ff2-a1dba7656508 tempest-VolumesAdminNegativeTest-1259132687 tempest-VolumesAdminNegativeTest-1259132687-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=68906) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1023.150098] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-25b03b1c-8272-403d-9ff2-a1dba7656508 tempest-VolumesAdminNegativeTest-1259132687 tempest-VolumesAdminNegativeTest-1259132687-project-member] Folder [datastore2] devstack-image-cache_base created. 
{{(pid=68906) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1023.151030] env[68906]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-0d5806b4-ca55-49f4-ac0e-6a16111a08a6 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1023.155343] env[68906]: DEBUG oslo_vmware.api [None req-25b03b1c-8272-403d-9ff2-a1dba7656508 tempest-VolumesAdminNegativeTest-1259132687 tempest-VolumesAdminNegativeTest-1259132687-project-member] Waiting for the task: (returnval){ [ 1023.155343] env[68906]: value = "session[52a3cb0d-1212-490c-2669-91043b4da4d8]52cc9828-1c90-210c-50eb-38dabd4d6033" [ 1023.155343] env[68906]: _type = "Task" [ 1023.155343] env[68906]: } to complete. {{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1023.163612] env[68906]: DEBUG oslo_vmware.api [None req-25b03b1c-8272-403d-9ff2-a1dba7656508 tempest-VolumesAdminNegativeTest-1259132687 tempest-VolumesAdminNegativeTest-1259132687-project-member] Task: {'id': session[52a3cb0d-1212-490c-2669-91043b4da4d8]52cc9828-1c90-210c-50eb-38dabd4d6033, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1023.221077] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-579f131a-2dbc-470d-9058-ec3a343d421d tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: d6ca51b9-b284-405c-878e-fdbc326b73e1] Unregistered the VM {{(pid=68906) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1023.221389] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-579f131a-2dbc-470d-9058-ec3a343d421d tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: d6ca51b9-b284-405c-878e-fdbc326b73e1] Deleting contents of the VM from datastore datastore2 {{(pid=68906) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1023.221486] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-579f131a-2dbc-470d-9058-ec3a343d421d tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Deleting the datastore file [datastore2] d6ca51b9-b284-405c-878e-fdbc326b73e1 {{(pid=68906) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1023.221748] env[68906]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-419ca5c8-20a8-4d09-a0c0-c2967e153160 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1023.228570] env[68906]: DEBUG oslo_vmware.api [None req-579f131a-2dbc-470d-9058-ec3a343d421d tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Waiting for the task: (returnval){ [ 1023.228570] env[68906]: value = "task-3475345" [ 1023.228570] env[68906]: _type = "Task" [ 1023.228570] env[68906]: } to complete. {{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1023.236359] env[68906]: DEBUG oslo_vmware.api [None req-579f131a-2dbc-470d-9058-ec3a343d421d tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Task: {'id': task-3475345, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1023.665305] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-25b03b1c-8272-403d-9ff2-a1dba7656508 tempest-VolumesAdminNegativeTest-1259132687 tempest-VolumesAdminNegativeTest-1259132687-project-member] [instance: ce63789a-1f0f-40ca-8368-ac3f84bb58cd] Preparing fetch location {{(pid=68906) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1023.665611] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-25b03b1c-8272-403d-9ff2-a1dba7656508 tempest-VolumesAdminNegativeTest-1259132687 tempest-VolumesAdminNegativeTest-1259132687-project-member] Creating directory with path [datastore2] vmware_temp/8c58a1d2-1681-46d0-b296-73023cc658b8/b1400c31-d33b-4e13-944f-4c645e62493e {{(pid=68906) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1023.665789] env[68906]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-7ecf9b05-a65a-400b-a3ed-02f15d970e67 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1023.677522] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-25b03b1c-8272-403d-9ff2-a1dba7656508 tempest-VolumesAdminNegativeTest-1259132687 tempest-VolumesAdminNegativeTest-1259132687-project-member] Created directory with path [datastore2] vmware_temp/8c58a1d2-1681-46d0-b296-73023cc658b8/b1400c31-d33b-4e13-944f-4c645e62493e {{(pid=68906) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1023.677708] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-25b03b1c-8272-403d-9ff2-a1dba7656508 tempest-VolumesAdminNegativeTest-1259132687 tempest-VolumesAdminNegativeTest-1259132687-project-member] [instance: ce63789a-1f0f-40ca-8368-ac3f84bb58cd] Fetch image to [datastore2] vmware_temp/8c58a1d2-1681-46d0-b296-73023cc658b8/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk {{(pid=68906) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1023.677874] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-25b03b1c-8272-403d-9ff2-a1dba7656508 tempest-VolumesAdminNegativeTest-1259132687 tempest-VolumesAdminNegativeTest-1259132687-project-member] [instance: ce63789a-1f0f-40ca-8368-ac3f84bb58cd] Downloading image file data b1400c31-d33b-4e13-944f-4c645e62493e to [datastore2] vmware_temp/8c58a1d2-1681-46d0-b296-73023cc658b8/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk on the data store datastore2 {{(pid=68906) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1023.678603] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b3360529-7c34-4860-8a1a-e7cf51dee6bf {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1023.685073] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-74c939b5-b45e-43e8-8c57-3c28f3dcb5b5 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1023.694170] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7d7fd8f5-21fb-400d-b02f-b7ac3f7da98c {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1023.725250] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-f95fedbc-697d-4b56-ac6f-3bd8b41e82e4 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1023.733676] env[68906]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-a5cd95c0-f3fe-4c1e-8cce-9255734ba624 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1023.737894] env[68906]: DEBUG oslo_vmware.api [None req-579f131a-2dbc-470d-9058-ec3a343d421d tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Task: {'id': task-3475345, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.075328} completed successfully. {{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1023.738492] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-579f131a-2dbc-470d-9058-ec3a343d421d tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Deleted the datastore file {{(pid=68906) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1023.738674] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-579f131a-2dbc-470d-9058-ec3a343d421d tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: d6ca51b9-b284-405c-878e-fdbc326b73e1] Deleted contents of the VM from datastore datastore2 {{(pid=68906) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1023.738867] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-579f131a-2dbc-470d-9058-ec3a343d421d tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: d6ca51b9-b284-405c-878e-fdbc326b73e1] Instance destroyed {{(pid=68906) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1023.739060] env[68906]: INFO nova.compute.manager [None req-579f131a-2dbc-470d-9058-ec3a343d421d tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: d6ca51b9-b284-405c-878e-fdbc326b73e1] Took 0.60 seconds to destroy the instance on the hypervisor. 
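The failed CopyVirtualDisk_Task earlier and the DeleteDatastoreFile_Task that just completed both pass through the same oslo_vmware wait_for_task/_poll_task loop: a task object is created, its progress is polled (the repeated "progress is 0%" lines), and on completion either the result is returned or the task's fault is rethrown to the caller. A minimal sketch of that poll-and-translate pattern follows; the TaskInfo shape, the poll callable, and this local VimFaultException are stand-ins for illustration, not the real suds/session plumbing:

    import time
    from dataclasses import dataclass

    @dataclass
    class TaskInfo:                      # stand-in for the vim TaskInfo object
        state: str                       # 'running' | 'success' | 'error'
        result: object = None
        faults: tuple = ()
        message: str = ""

    class VimFaultException(Exception):  # local stand-in, mirrors the log
        def __init__(self, faults, message):
            super().__init__(message)
            self.fault_list = list(faults)

    def wait_for_task(poll, interval=0.5):
        """Poll a vCenter task until it finishes; rethrow its fault on error."""
        while True:
            info = poll()                # one property-collector round trip
            if info.state == 'success':
                return info.result       # e.g. task-3475345 after 0.075s
            if info.state == 'error':
                # The error branch: this is the shape of the step that
                # surfaced "A specified parameter was not correct: fileType"
                # / Faults: ['InvalidArgument'] in the traceback above.
                raise VimFaultException(info.faults, info.message)
            time.sleep(interval)         # "... progress is 0%" and retry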
[ 1023.741415] env[68906]: DEBUG nova.compute.claims [None req-579f131a-2dbc-470d-9058-ec3a343d421d tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: d6ca51b9-b284-405c-878e-fdbc326b73e1] Aborting claim: {{(pid=68906) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1023.741415] env[68906]: DEBUG oslo_concurrency.lockutils [None req-579f131a-2dbc-470d-9058-ec3a343d421d tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1023.741540] env[68906]: DEBUG oslo_concurrency.lockutils [None req-579f131a-2dbc-470d-9058-ec3a343d421d tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1023.756525] env[68906]: DEBUG nova.virt.vmwareapi.images [None req-25b03b1c-8272-403d-9ff2-a1dba7656508 tempest-VolumesAdminNegativeTest-1259132687 tempest-VolumesAdminNegativeTest-1259132687-project-member] [instance: ce63789a-1f0f-40ca-8368-ac3f84bb58cd] Downloading image file data b1400c31-d33b-4e13-944f-4c645e62493e to the data store datastore2 {{(pid=68906) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1023.813672] env[68906]: DEBUG oslo_vmware.rw_handles [None req-25b03b1c-8272-403d-9ff2-a1dba7656508 tempest-VolumesAdminNegativeTest-1259132687 tempest-VolumesAdminNegativeTest-1259132687-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/8c58a1d2-1681-46d0-b296-73023cc658b8/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=68906) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1023.874464] env[68906]: DEBUG oslo_vmware.rw_handles [None req-25b03b1c-8272-403d-9ff2-a1dba7656508 tempest-VolumesAdminNegativeTest-1259132687 tempest-VolumesAdminNegativeTest-1259132687-project-member] Completed reading data from the image iterator. {{(pid=68906) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1023.874649] env[68906]: DEBUG oslo_vmware.rw_handles [None req-25b03b1c-8272-403d-9ff2-a1dba7656508 tempest-VolumesAdminNegativeTest-1259132687 tempest-VolumesAdminNegativeTest-1259132687-project-member] Closing write handle for https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/8c58a1d2-1681-46d0-b296-73023cc658b8/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=68906) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1024.128679] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c7a8f908-7fd8-4c58-9df1-e67cb37db974 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1024.136459] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9feb3948-ce87-4ab9-8fa4-0bb4e69dca83 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1024.168296] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4998dac1-66ba-4c10-8a59-2407eadbf33f {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1024.176096] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ad2a121b-7a06-43ed-b81d-e9459287f559 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1024.189493] env[68906]: DEBUG nova.compute.provider_tree [None req-579f131a-2dbc-470d-9058-ec3a343d421d tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Inventory has not changed in ProviderTree for provider: 1119f6db-bfd7-4ef3-bdff-5c6974dc249b {{(pid=68906) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1024.197881] env[68906]: DEBUG nova.scheduler.client.report [None req-579f131a-2dbc-470d-9058-ec3a343d421d tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Inventory has not changed for provider 1119f6db-bfd7-4ef3-bdff-5c6974dc249b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 93, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68906) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1024.211613] env[68906]: DEBUG oslo_concurrency.lockutils [None req-579f131a-2dbc-470d-9058-ec3a343d421d tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.470s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1024.212127] env[68906]: ERROR nova.compute.manager [None req-579f131a-2dbc-470d-9058-ec3a343d421d tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: d6ca51b9-b284-405c-878e-fdbc326b73e1] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1024.212127] env[68906]: Faults: ['InvalidArgument'] [ 1024.212127] env[68906]: ERROR nova.compute.manager [instance: d6ca51b9-b284-405c-878e-fdbc326b73e1] Traceback (most recent call last): [ 1024.212127] env[68906]: ERROR nova.compute.manager [instance: d6ca51b9-b284-405c-878e-fdbc326b73e1] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1024.212127] env[68906]: ERROR nova.compute.manager [instance: 
d6ca51b9-b284-405c-878e-fdbc326b73e1] self.driver.spawn(context, instance, image_meta, [ 1024.212127] env[68906]: ERROR nova.compute.manager [instance: d6ca51b9-b284-405c-878e-fdbc326b73e1] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1024.212127] env[68906]: ERROR nova.compute.manager [instance: d6ca51b9-b284-405c-878e-fdbc326b73e1] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1024.212127] env[68906]: ERROR nova.compute.manager [instance: d6ca51b9-b284-405c-878e-fdbc326b73e1] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1024.212127] env[68906]: ERROR nova.compute.manager [instance: d6ca51b9-b284-405c-878e-fdbc326b73e1] self._fetch_image_if_missing(context, vi) [ 1024.212127] env[68906]: ERROR nova.compute.manager [instance: d6ca51b9-b284-405c-878e-fdbc326b73e1] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1024.212127] env[68906]: ERROR nova.compute.manager [instance: d6ca51b9-b284-405c-878e-fdbc326b73e1] image_cache(vi, tmp_image_ds_loc) [ 1024.212127] env[68906]: ERROR nova.compute.manager [instance: d6ca51b9-b284-405c-878e-fdbc326b73e1] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1024.212404] env[68906]: ERROR nova.compute.manager [instance: d6ca51b9-b284-405c-878e-fdbc326b73e1] vm_util.copy_virtual_disk( [ 1024.212404] env[68906]: ERROR nova.compute.manager [instance: d6ca51b9-b284-405c-878e-fdbc326b73e1] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1024.212404] env[68906]: ERROR nova.compute.manager [instance: d6ca51b9-b284-405c-878e-fdbc326b73e1] session._wait_for_task(vmdk_copy_task) [ 1024.212404] env[68906]: ERROR nova.compute.manager [instance: d6ca51b9-b284-405c-878e-fdbc326b73e1] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1024.212404] env[68906]: ERROR nova.compute.manager [instance: d6ca51b9-b284-405c-878e-fdbc326b73e1] return self.wait_for_task(task_ref) [ 1024.212404] env[68906]: ERROR nova.compute.manager [instance: d6ca51b9-b284-405c-878e-fdbc326b73e1] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1024.212404] env[68906]: ERROR nova.compute.manager [instance: d6ca51b9-b284-405c-878e-fdbc326b73e1] return evt.wait() [ 1024.212404] env[68906]: ERROR nova.compute.manager [instance: d6ca51b9-b284-405c-878e-fdbc326b73e1] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1024.212404] env[68906]: ERROR nova.compute.manager [instance: d6ca51b9-b284-405c-878e-fdbc326b73e1] result = hub.switch() [ 1024.212404] env[68906]: ERROR nova.compute.manager [instance: d6ca51b9-b284-405c-878e-fdbc326b73e1] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1024.212404] env[68906]: ERROR nova.compute.manager [instance: d6ca51b9-b284-405c-878e-fdbc326b73e1] return self.greenlet.switch() [ 1024.212404] env[68906]: ERROR nova.compute.manager [instance: d6ca51b9-b284-405c-878e-fdbc326b73e1] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1024.212404] env[68906]: ERROR nova.compute.manager [instance: d6ca51b9-b284-405c-878e-fdbc326b73e1] self.f(*self.args, **self.kw) [ 1024.213048] env[68906]: ERROR nova.compute.manager [instance: d6ca51b9-b284-405c-878e-fdbc326b73e1] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1024.213048] env[68906]: ERROR nova.compute.manager [instance: d6ca51b9-b284-405c-878e-fdbc326b73e1] raise exceptions.translate_fault(task_info.error) [ 1024.213048] env[68906]: ERROR nova.compute.manager [instance: d6ca51b9-b284-405c-878e-fdbc326b73e1] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1024.213048] env[68906]: ERROR nova.compute.manager [instance: d6ca51b9-b284-405c-878e-fdbc326b73e1] Faults: ['InvalidArgument'] [ 1024.213048] env[68906]: ERROR nova.compute.manager [instance: d6ca51b9-b284-405c-878e-fdbc326b73e1] [ 1024.213048] env[68906]: DEBUG nova.compute.utils [None req-579f131a-2dbc-470d-9058-ec3a343d421d tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: d6ca51b9-b284-405c-878e-fdbc326b73e1] VimFaultException {{(pid=68906) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1024.214433] env[68906]: DEBUG nova.compute.manager [None req-579f131a-2dbc-470d-9058-ec3a343d421d tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: d6ca51b9-b284-405c-878e-fdbc326b73e1] Build of instance d6ca51b9-b284-405c-878e-fdbc326b73e1 was re-scheduled: A specified parameter was not correct: fileType [ 1024.214433] env[68906]: Faults: ['InvalidArgument'] {{(pid=68906) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1024.214808] env[68906]: DEBUG nova.compute.manager [None req-579f131a-2dbc-470d-9058-ec3a343d421d tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: d6ca51b9-b284-405c-878e-fdbc326b73e1] Unplugging VIFs for instance {{(pid=68906) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1024.214986] env[68906]: DEBUG nova.compute.manager [None req-579f131a-2dbc-470d-9058-ec3a343d421d tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=68906) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1024.215228] env[68906]: DEBUG nova.compute.manager [None req-579f131a-2dbc-470d-9058-ec3a343d421d tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: d6ca51b9-b284-405c-878e-fdbc326b73e1] Deallocating network for instance {{(pid=68906) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1024.215428] env[68906]: DEBUG nova.network.neutron [None req-579f131a-2dbc-470d-9058-ec3a343d421d tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: d6ca51b9-b284-405c-878e-fdbc326b73e1] deallocate_for_instance() {{(pid=68906) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1024.582899] env[68906]: DEBUG nova.network.neutron [None req-579f131a-2dbc-470d-9058-ec3a343d421d tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: d6ca51b9-b284-405c-878e-fdbc326b73e1] Updating instance_info_cache with network_info: [] {{(pid=68906) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1024.594769] env[68906]: INFO nova.compute.manager [None req-579f131a-2dbc-470d-9058-ec3a343d421d tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: d6ca51b9-b284-405c-878e-fdbc326b73e1] Took 0.38 seconds to deallocate network for instance. [ 1024.689207] env[68906]: INFO nova.scheduler.client.report [None req-579f131a-2dbc-470d-9058-ec3a343d421d tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Deleted allocations for instance d6ca51b9-b284-405c-878e-fdbc326b73e1 [ 1024.711589] env[68906]: DEBUG oslo_concurrency.lockutils [None req-579f131a-2dbc-470d-9058-ec3a343d421d tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Lock "d6ca51b9-b284-405c-878e-fdbc326b73e1" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 382.017s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1024.712727] env[68906]: DEBUG oslo_concurrency.lockutils [None req-018fe1e0-c886-4e1d-931e-35ad80c838d5 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Lock "d6ca51b9-b284-405c-878e-fdbc326b73e1" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 182.562s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1024.712948] env[68906]: DEBUG oslo_concurrency.lockutils [None req-018fe1e0-c886-4e1d-931e-35ad80c838d5 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Acquiring lock "d6ca51b9-b284-405c-878e-fdbc326b73e1-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1024.713166] env[68906]: DEBUG oslo_concurrency.lockutils [None req-018fe1e0-c886-4e1d-931e-35ad80c838d5 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Lock "d6ca51b9-b284-405c-878e-fdbc326b73e1-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 
1024.713334] env[68906]: DEBUG oslo_concurrency.lockutils [None req-018fe1e0-c886-4e1d-931e-35ad80c838d5 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Lock "d6ca51b9-b284-405c-878e-fdbc326b73e1-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1024.715460] env[68906]: INFO nova.compute.manager [None req-018fe1e0-c886-4e1d-931e-35ad80c838d5 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: d6ca51b9-b284-405c-878e-fdbc326b73e1] Terminating instance [ 1024.717010] env[68906]: DEBUG nova.compute.manager [None req-018fe1e0-c886-4e1d-931e-35ad80c838d5 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: d6ca51b9-b284-405c-878e-fdbc326b73e1] Start destroying the instance on the hypervisor. {{(pid=68906) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1024.717288] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-018fe1e0-c886-4e1d-931e-35ad80c838d5 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: d6ca51b9-b284-405c-878e-fdbc326b73e1] Destroying instance {{(pid=68906) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1024.717659] env[68906]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-e0ddff1d-b942-48de-ba4c-ae25dd4decd6 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1024.722617] env[68906]: DEBUG nova.compute.manager [None req-8e5cf191-d2ff-420a-92ea-3d8a1505aeef tempest-ServersTestManualDisk-672681136 tempest-ServersTestManualDisk-672681136-project-member] [instance: 641cca5b-d749-4331-a5e0-8acb6d47cba2] Starting instance... {{(pid=68906) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1024.729125] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-90e9feb0-1982-4b2b-a0ae-18bda5e507e0 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1024.760136] env[68906]: WARNING nova.virt.vmwareapi.vmops [None req-018fe1e0-c886-4e1d-931e-35ad80c838d5 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: d6ca51b9-b284-405c-878e-fdbc326b73e1] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance d6ca51b9-b284-405c-878e-fdbc326b73e1 could not be found. [ 1024.760345] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-018fe1e0-c886-4e1d-931e-35ad80c838d5 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: d6ca51b9-b284-405c-878e-fdbc326b73e1] Instance destroyed {{(pid=68906) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1024.760523] env[68906]: INFO nova.compute.manager [None req-018fe1e0-c886-4e1d-931e-35ad80c838d5 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: d6ca51b9-b284-405c-878e-fdbc326b73e1] Took 0.04 seconds to destroy the instance on the hypervisor. 
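By the time this second terminate runs, the earlier failure path has already unregistered the VM and deleted its datastore files, so the backend lookup raises InstanceNotFound and the destroy simply logs the WARNING above and carries on. A rough sketch of that best-effort destroy shape, assuming hypothetical callables find_vm_ref and unregister_and_delete in place of the real vmops/SearchIndex machinery:

    class InstanceNotFound(Exception):   # stand-in for nova.exception's class
        pass

    def destroy(find_vm_ref, unregister_and_delete, instance_uuid, log):
        """Best-effort destroy: a VM missing on the backend is not fatal."""
        try:
            vm_ref = find_vm_ref(instance_uuid)   # cf. FindAllByUuid above
            unregister_and_delete(vm_ref)
        except InstanceNotFound as exc:
            # Matches the WARNING in the log: record it and fall through,
            # so the instance is still reported destroyed and network/claim
            # cleanup can proceed.
            log.warning("Instance does not exist on backend: %s", exc)
        log.debug("Instance destroyed")           # reported either way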
[ 1024.760759] env[68906]: DEBUG oslo.service.loopingcall [None req-018fe1e0-c886-4e1d-931e-35ad80c838d5 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=68906) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1024.760985] env[68906]: DEBUG nova.compute.manager [-] [instance: d6ca51b9-b284-405c-878e-fdbc326b73e1] Deallocating network for instance {{(pid=68906) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1024.761093] env[68906]: DEBUG nova.network.neutron [-] [instance: d6ca51b9-b284-405c-878e-fdbc326b73e1] deallocate_for_instance() {{(pid=68906) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1024.779087] env[68906]: DEBUG oslo_concurrency.lockutils [None req-8e5cf191-d2ff-420a-92ea-3d8a1505aeef tempest-ServersTestManualDisk-672681136 tempest-ServersTestManualDisk-672681136-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1024.779087] env[68906]: DEBUG oslo_concurrency.lockutils [None req-8e5cf191-d2ff-420a-92ea-3d8a1505aeef tempest-ServersTestManualDisk-672681136 tempest-ServersTestManualDisk-672681136-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1024.779642] env[68906]: INFO nova.compute.claims [None req-8e5cf191-d2ff-420a-92ea-3d8a1505aeef tempest-ServersTestManualDisk-672681136 tempest-ServersTestManualDisk-672681136-project-member] [instance: 641cca5b-d749-4331-a5e0-8acb6d47cba2] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1024.793682] env[68906]: DEBUG nova.network.neutron [-] [instance: d6ca51b9-b284-405c-878e-fdbc326b73e1] Updating instance_info_cache with network_info: [] {{(pid=68906) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1024.816474] env[68906]: INFO nova.compute.manager [-] [instance: d6ca51b9-b284-405c-878e-fdbc326b73e1] Took 0.06 seconds to deallocate network for instance. 
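The oslo_concurrency.lockutils lines throughout this log record two durations per critical section: how long the caller waited to acquire the named lock ("waited 0.000s") and how long it was held ("held 0.457s"). A small illustrative equivalent, with a plain threading.Lock registry standing in for the real named-lock machinery (assumption: in-process locks only, no external/file locks):

    import threading
    import time
    from contextlib import contextmanager

    _locks: dict[str, threading.Lock] = {}   # name -> lock registry

    @contextmanager
    def traced_lock(name, caller):
        lock = _locks.setdefault(name, threading.Lock())
        start = time.monotonic()
        lock.acquire()
        waited = time.monotonic() - start     # time spent blocked
        print(f'Lock "{name}" acquired by "{caller}" :: waited {waited:.3f}s')
        acquired_at = time.monotonic()
        try:
            yield
        finally:
            held = time.monotonic() - acquired_at   # time inside the section
            lock.release()
            print(f'Lock "{name}" "released" by "{caller}" :: held {held:.3f}s')

    # usage, mirroring the resource-tracker lines above:
    # with traced_lock("compute_resources", "ResourceTracker.instance_claim"):
    #     ...  # claim bookkeeping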
[ 1024.917122] env[68906]: DEBUG oslo_concurrency.lockutils [None req-018fe1e0-c886-4e1d-931e-35ad80c838d5 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Lock "d6ca51b9-b284-405c-878e-fdbc326b73e1" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.204s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1025.145374] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c7eabb72-7e36-44f0-99b2-39aac3b2ec9a {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1025.153970] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c197e133-cd72-4c24-b343-a34b14cf4921 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1025.187968] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4a54f7f0-35ca-4b3f-91ce-5e8b4908c256 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1025.196378] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f6898634-3ed8-4a32-9f08-1cbac1fb6230 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1025.210593] env[68906]: DEBUG nova.compute.provider_tree [None req-8e5cf191-d2ff-420a-92ea-3d8a1505aeef tempest-ServersTestManualDisk-672681136 tempest-ServersTestManualDisk-672681136-project-member] Inventory has not changed in ProviderTree for provider: 1119f6db-bfd7-4ef3-bdff-5c6974dc249b {{(pid=68906) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1025.220288] env[68906]: DEBUG nova.scheduler.client.report [None req-8e5cf191-d2ff-420a-92ea-3d8a1505aeef tempest-ServersTestManualDisk-672681136 tempest-ServersTestManualDisk-672681136-project-member] Inventory has not changed for provider 1119f6db-bfd7-4ef3-bdff-5c6974dc249b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 93, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68906) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1025.237314] env[68906]: DEBUG oslo_concurrency.lockutils [None req-8e5cf191-d2ff-420a-92ea-3d8a1505aeef tempest-ServersTestManualDisk-672681136 tempest-ServersTestManualDisk-672681136-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.457s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1025.237314] env[68906]: DEBUG nova.compute.manager [None req-8e5cf191-d2ff-420a-92ea-3d8a1505aeef tempest-ServersTestManualDisk-672681136 tempest-ServersTestManualDisk-672681136-project-member] [instance: 641cca5b-d749-4331-a5e0-8acb6d47cba2] Start building networks asynchronously for instance. 
{{(pid=68906) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1025.273026] env[68906]: DEBUG nova.compute.utils [None req-8e5cf191-d2ff-420a-92ea-3d8a1505aeef tempest-ServersTestManualDisk-672681136 tempest-ServersTestManualDisk-672681136-project-member] Using /dev/sd instead of None {{(pid=68906) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1025.273991] env[68906]: DEBUG nova.compute.manager [None req-8e5cf191-d2ff-420a-92ea-3d8a1505aeef tempest-ServersTestManualDisk-672681136 tempest-ServersTestManualDisk-672681136-project-member] [instance: 641cca5b-d749-4331-a5e0-8acb6d47cba2] Allocating IP information in the background. {{(pid=68906) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1025.274139] env[68906]: DEBUG nova.network.neutron [None req-8e5cf191-d2ff-420a-92ea-3d8a1505aeef tempest-ServersTestManualDisk-672681136 tempest-ServersTestManualDisk-672681136-project-member] [instance: 641cca5b-d749-4331-a5e0-8acb6d47cba2] allocate_for_instance() {{(pid=68906) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1025.285565] env[68906]: DEBUG nova.compute.manager [None req-8e5cf191-d2ff-420a-92ea-3d8a1505aeef tempest-ServersTestManualDisk-672681136 tempest-ServersTestManualDisk-672681136-project-member] [instance: 641cca5b-d749-4331-a5e0-8acb6d47cba2] Start building block device mappings for instance. {{(pid=68906) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1025.359536] env[68906]: DEBUG nova.compute.manager [None req-8e5cf191-d2ff-420a-92ea-3d8a1505aeef tempest-ServersTestManualDisk-672681136 tempest-ServersTestManualDisk-672681136-project-member] [instance: 641cca5b-d749-4331-a5e0-8acb6d47cba2] Start spawning the instance on the hypervisor. 
{{(pid=68906) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1025.377345] env[68906]: DEBUG nova.policy [None req-8e5cf191-d2ff-420a-92ea-3d8a1505aeef tempest-ServersTestManualDisk-672681136 tempest-ServersTestManualDisk-672681136-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'baacf36ec4ca4947a0d9f66b8a26295f', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '47716655ec4542cd8b253d5ddff088fb', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68906) authorize /opt/stack/nova/nova/policy.py:203}} [ 1025.388334] env[68906]: DEBUG nova.virt.hardware [None req-8e5cf191-d2ff-420a-92ea-3d8a1505aeef tempest-ServersTestManualDisk-672681136 tempest-ServersTestManualDisk-672681136-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-17T13:00:39Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-17T13:00:23Z,direct_url=,disk_format='vmdk',id=b1400c31-d33b-4e13-944f-4c645e62493e,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='1ae7bf3a375d41c6af5e7536af51ffd1',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-17T13:00:24Z,virtual_size=,visibility=), allow threads: False {{(pid=68906) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1025.388602] env[68906]: DEBUG nova.virt.hardware [None req-8e5cf191-d2ff-420a-92ea-3d8a1505aeef tempest-ServersTestManualDisk-672681136 tempest-ServersTestManualDisk-672681136-project-member] Flavor limits 0:0:0 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1025.388761] env[68906]: DEBUG nova.virt.hardware [None req-8e5cf191-d2ff-420a-92ea-3d8a1505aeef tempest-ServersTestManualDisk-672681136 tempest-ServersTestManualDisk-672681136-project-member] Image limits 0:0:0 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1025.388963] env[68906]: DEBUG nova.virt.hardware [None req-8e5cf191-d2ff-420a-92ea-3d8a1505aeef tempest-ServersTestManualDisk-672681136 tempest-ServersTestManualDisk-672681136-project-member] Flavor pref 0:0:0 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1025.389130] env[68906]: DEBUG nova.virt.hardware [None req-8e5cf191-d2ff-420a-92ea-3d8a1505aeef tempest-ServersTestManualDisk-672681136 tempest-ServersTestManualDisk-672681136-project-member] Image pref 0:0:0 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1025.389281] env[68906]: DEBUG nova.virt.hardware [None req-8e5cf191-d2ff-420a-92ea-3d8a1505aeef tempest-ServersTestManualDisk-672681136 tempest-ServersTestManualDisk-672681136-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1025.389492] env[68906]: DEBUG nova.virt.hardware [None 
req-8e5cf191-d2ff-420a-92ea-3d8a1505aeef tempest-ServersTestManualDisk-672681136 tempest-ServersTestManualDisk-672681136-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68906) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1025.389648] env[68906]: DEBUG nova.virt.hardware [None req-8e5cf191-d2ff-420a-92ea-3d8a1505aeef tempest-ServersTestManualDisk-672681136 tempest-ServersTestManualDisk-672681136-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68906) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1025.389812] env[68906]: DEBUG nova.virt.hardware [None req-8e5cf191-d2ff-420a-92ea-3d8a1505aeef tempest-ServersTestManualDisk-672681136 tempest-ServersTestManualDisk-672681136-project-member] Got 1 possible topologies {{(pid=68906) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1025.389974] env[68906]: DEBUG nova.virt.hardware [None req-8e5cf191-d2ff-420a-92ea-3d8a1505aeef tempest-ServersTestManualDisk-672681136 tempest-ServersTestManualDisk-672681136-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68906) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1025.390175] env[68906]: DEBUG nova.virt.hardware [None req-8e5cf191-d2ff-420a-92ea-3d8a1505aeef tempest-ServersTestManualDisk-672681136 tempest-ServersTestManualDisk-672681136-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68906) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1025.391054] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-df211ed1-d66b-4a5a-aa9e-a532c2d9eddf {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1025.399559] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d7b12503-18fd-4c5e-b237-fd0e1b3b2c66 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1026.021455] env[68906]: DEBUG nova.network.neutron [None req-8e5cf191-d2ff-420a-92ea-3d8a1505aeef tempest-ServersTestManualDisk-672681136 tempest-ServersTestManualDisk-672681136-project-member] [instance: 641cca5b-d749-4331-a5e0-8acb6d47cba2] Successfully created port: b2b5e95c-c1a3-4424-87ed-fe8468a781db {{(pid=68906) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1026.434776] env[68906]: DEBUG oslo_concurrency.lockutils [None req-67eb7be2-f488-498f-bd04-d2fc16581526 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Acquiring lock "19d8683f-32f8-48b1-960a-b91b5f82a815" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1026.434776] env[68906]: DEBUG oslo_concurrency.lockutils [None req-67eb7be2-f488-498f-bd04-d2fc16581526 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Lock "19d8683f-32f8-48b1-960a-b91b5f82a815" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 
1027.108292] env[68906]: DEBUG nova.network.neutron [None req-8e5cf191-d2ff-420a-92ea-3d8a1505aeef tempest-ServersTestManualDisk-672681136 tempest-ServersTestManualDisk-672681136-project-member] [instance: 641cca5b-d749-4331-a5e0-8acb6d47cba2] Successfully updated port: b2b5e95c-c1a3-4424-87ed-fe8468a781db {{(pid=68906) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1027.119650] env[68906]: DEBUG oslo_concurrency.lockutils [None req-8e5cf191-d2ff-420a-92ea-3d8a1505aeef tempest-ServersTestManualDisk-672681136 tempest-ServersTestManualDisk-672681136-project-member] Acquiring lock "refresh_cache-641cca5b-d749-4331-a5e0-8acb6d47cba2" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1027.119796] env[68906]: DEBUG oslo_concurrency.lockutils [None req-8e5cf191-d2ff-420a-92ea-3d8a1505aeef tempest-ServersTestManualDisk-672681136 tempest-ServersTestManualDisk-672681136-project-member] Acquired lock "refresh_cache-641cca5b-d749-4331-a5e0-8acb6d47cba2" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1027.119950] env[68906]: DEBUG nova.network.neutron [None req-8e5cf191-d2ff-420a-92ea-3d8a1505aeef tempest-ServersTestManualDisk-672681136 tempest-ServersTestManualDisk-672681136-project-member] [instance: 641cca5b-d749-4331-a5e0-8acb6d47cba2] Building network info cache for instance {{(pid=68906) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1027.201929] env[68906]: DEBUG nova.network.neutron [None req-8e5cf191-d2ff-420a-92ea-3d8a1505aeef tempest-ServersTestManualDisk-672681136 tempest-ServersTestManualDisk-672681136-project-member] [instance: 641cca5b-d749-4331-a5e0-8acb6d47cba2] Instance cache missing network info. 
{{(pid=68906) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1027.520832] env[68906]: DEBUG nova.network.neutron [None req-8e5cf191-d2ff-420a-92ea-3d8a1505aeef tempest-ServersTestManualDisk-672681136 tempest-ServersTestManualDisk-672681136-project-member] [instance: 641cca5b-d749-4331-a5e0-8acb6d47cba2] Updating instance_info_cache with network_info: [{"id": "b2b5e95c-c1a3-4424-87ed-fe8468a781db", "address": "fa:16:3e:e8:51:b6", "network": {"id": "0667b3c4-ef87-4878-8f3c-82e8925d0c25", "bridge": "br-int", "label": "tempest-ServersTestManualDisk-2067406661-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "47716655ec4542cd8b253d5ddff088fb", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "6a2e2e51-010f-4535-ba88-433663275996", "external-id": "nsx-vlan-transportzone-915", "segmentation_id": 915, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapb2b5e95c-c1", "ovs_interfaceid": "b2b5e95c-c1a3-4424-87ed-fe8468a781db", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68906) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1027.534328] env[68906]: DEBUG oslo_concurrency.lockutils [None req-8e5cf191-d2ff-420a-92ea-3d8a1505aeef tempest-ServersTestManualDisk-672681136 tempest-ServersTestManualDisk-672681136-project-member] Releasing lock "refresh_cache-641cca5b-d749-4331-a5e0-8acb6d47cba2" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1027.534637] env[68906]: DEBUG nova.compute.manager [None req-8e5cf191-d2ff-420a-92ea-3d8a1505aeef tempest-ServersTestManualDisk-672681136 tempest-ServersTestManualDisk-672681136-project-member] [instance: 641cca5b-d749-4331-a5e0-8acb6d47cba2] Instance network_info: |[{"id": "b2b5e95c-c1a3-4424-87ed-fe8468a781db", "address": "fa:16:3e:e8:51:b6", "network": {"id": "0667b3c4-ef87-4878-8f3c-82e8925d0c25", "bridge": "br-int", "label": "tempest-ServersTestManualDisk-2067406661-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "47716655ec4542cd8b253d5ddff088fb", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "6a2e2e51-010f-4535-ba88-433663275996", "external-id": "nsx-vlan-transportzone-915", "segmentation_id": 915, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapb2b5e95c-c1", "ovs_interfaceid": "b2b5e95c-c1a3-4424-87ed-fe8468a781db", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=68906) _allocate_network_async 
/opt/stack/nova/nova/compute/manager.py:1971}} [ 1027.535042] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-8e5cf191-d2ff-420a-92ea-3d8a1505aeef tempest-ServersTestManualDisk-672681136 tempest-ServersTestManualDisk-672681136-project-member] [instance: 641cca5b-d749-4331-a5e0-8acb6d47cba2] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:e8:51:b6', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '6a2e2e51-010f-4535-ba88-433663275996', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'b2b5e95c-c1a3-4424-87ed-fe8468a781db', 'vif_model': 'vmxnet3'}] {{(pid=68906) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1027.542723] env[68906]: DEBUG nova.virt.vmwareapi.vm_util [None req-8e5cf191-d2ff-420a-92ea-3d8a1505aeef tempest-ServersTestManualDisk-672681136 tempest-ServersTestManualDisk-672681136-project-member] Creating folder: Project (47716655ec4542cd8b253d5ddff088fb). Parent ref: group-v694750. {{(pid=68906) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1027.543247] env[68906]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-9ba83d4c-bb53-4355-a177-b7084dc0b463 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1027.555090] env[68906]: INFO nova.virt.vmwareapi.vm_util [None req-8e5cf191-d2ff-420a-92ea-3d8a1505aeef tempest-ServersTestManualDisk-672681136 tempest-ServersTestManualDisk-672681136-project-member] Created folder: Project (47716655ec4542cd8b253d5ddff088fb) in parent group-v694750. [ 1027.555308] env[68906]: DEBUG nova.virt.vmwareapi.vm_util [None req-8e5cf191-d2ff-420a-92ea-3d8a1505aeef tempest-ServersTestManualDisk-672681136 tempest-ServersTestManualDisk-672681136-project-member] Creating folder: Instances. Parent ref: group-v694808. {{(pid=68906) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1027.555536] env[68906]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-8d0affca-db69-464b-8d2b-3ad6a845f050 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1027.565024] env[68906]: INFO nova.virt.vmwareapi.vm_util [None req-8e5cf191-d2ff-420a-92ea-3d8a1505aeef tempest-ServersTestManualDisk-672681136 tempest-ServersTestManualDisk-672681136-project-member] Created folder: Instances in parent group-v694808. [ 1027.565212] env[68906]: DEBUG oslo.service.loopingcall [None req-8e5cf191-d2ff-420a-92ea-3d8a1505aeef tempest-ServersTestManualDisk-672681136 tempest-ServersTestManualDisk-672681136-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=68906) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1027.566201] env[68906]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 641cca5b-d749-4331-a5e0-8acb6d47cba2] Creating VM on the ESX host {{(pid=68906) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1027.567368] env[68906]: DEBUG nova.compute.manager [req-0a1d2e6a-6a1f-4951-8de2-bef6b1828be8 req-86e73d57-1c54-496a-b610-491bb1e6664e service nova] [instance: 641cca5b-d749-4331-a5e0-8acb6d47cba2] Received event network-vif-plugged-b2b5e95c-c1a3-4424-87ed-fe8468a781db {{(pid=68906) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1027.567537] env[68906]: DEBUG oslo_concurrency.lockutils [req-0a1d2e6a-6a1f-4951-8de2-bef6b1828be8 req-86e73d57-1c54-496a-b610-491bb1e6664e service nova] Acquiring lock "641cca5b-d749-4331-a5e0-8acb6d47cba2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1027.567734] env[68906]: DEBUG oslo_concurrency.lockutils [req-0a1d2e6a-6a1f-4951-8de2-bef6b1828be8 req-86e73d57-1c54-496a-b610-491bb1e6664e service nova] Lock "641cca5b-d749-4331-a5e0-8acb6d47cba2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1027.567894] env[68906]: DEBUG oslo_concurrency.lockutils [req-0a1d2e6a-6a1f-4951-8de2-bef6b1828be8 req-86e73d57-1c54-496a-b610-491bb1e6664e service nova] Lock "641cca5b-d749-4331-a5e0-8acb6d47cba2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1027.568077] env[68906]: DEBUG nova.compute.manager [req-0a1d2e6a-6a1f-4951-8de2-bef6b1828be8 req-86e73d57-1c54-496a-b610-491bb1e6664e service nova] [instance: 641cca5b-d749-4331-a5e0-8acb6d47cba2] No waiting events found dispatching network-vif-plugged-b2b5e95c-c1a3-4424-87ed-fe8468a781db {{(pid=68906) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1027.568245] env[68906]: WARNING nova.compute.manager [req-0a1d2e6a-6a1f-4951-8de2-bef6b1828be8 req-86e73d57-1c54-496a-b610-491bb1e6664e service nova] [instance: 641cca5b-d749-4331-a5e0-8acb6d47cba2] Received unexpected event network-vif-plugged-b2b5e95c-c1a3-4424-87ed-fe8468a781db for instance with vm_state building and task_state spawning. [ 1027.568509] env[68906]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-0c9ad84e-e64f-4e4b-8c8d-139ddcc7d737 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1027.587710] env[68906]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1027.587710] env[68906]: value = "task-3475348" [ 1027.587710] env[68906]: _type = "Task" [ 1027.587710] env[68906]: } to complete. {{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1027.595299] env[68906]: DEBUG oslo_vmware.api [-] Task: {'id': task-3475348, 'name': CreateVM_Task} progress is 0%. 
{{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1028.098203] env[68906]: DEBUG oslo_vmware.api [-] Task: {'id': task-3475348, 'name': CreateVM_Task, 'duration_secs': 0.303422} completed successfully. {{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1028.098383] env[68906]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 641cca5b-d749-4331-a5e0-8acb6d47cba2] Created VM on the ESX host {{(pid=68906) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1028.099116] env[68906]: DEBUG oslo_concurrency.lockutils [None req-8e5cf191-d2ff-420a-92ea-3d8a1505aeef tempest-ServersTestManualDisk-672681136 tempest-ServersTestManualDisk-672681136-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1028.099307] env[68906]: DEBUG oslo_concurrency.lockutils [None req-8e5cf191-d2ff-420a-92ea-3d8a1505aeef tempest-ServersTestManualDisk-672681136 tempest-ServersTestManualDisk-672681136-project-member] Acquired lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1028.099603] env[68906]: DEBUG oslo_concurrency.lockutils [None req-8e5cf191-d2ff-420a-92ea-3d8a1505aeef tempest-ServersTestManualDisk-672681136 tempest-ServersTestManualDisk-672681136-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1028.099843] env[68906]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-5aee8971-fc3d-43c5-9ae3-3207fc125265 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1028.104498] env[68906]: DEBUG oslo_vmware.api [None req-8e5cf191-d2ff-420a-92ea-3d8a1505aeef tempest-ServersTestManualDisk-672681136 tempest-ServersTestManualDisk-672681136-project-member] Waiting for the task: (returnval){ [ 1028.104498] env[68906]: value = "session[52a3cb0d-1212-490c-2669-91043b4da4d8]521acf57-6ac9-f77e-2e7a-21609ce04c2f" [ 1028.104498] env[68906]: _type = "Task" [ 1028.104498] env[68906]: } to complete. {{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1028.112086] env[68906]: DEBUG oslo_vmware.api [None req-8e5cf191-d2ff-420a-92ea-3d8a1505aeef tempest-ServersTestManualDisk-672681136 tempest-ServersTestManualDisk-672681136-project-member] Task: {'id': session[52a3cb0d-1212-490c-2669-91043b4da4d8]521acf57-6ac9-f77e-2e7a-21609ce04c2f, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1028.476103] env[68906]: DEBUG oslo_concurrency.lockutils [None req-ec74f4e8-e37f-4e29-9aee-a15837533de1 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Acquiring lock "1fdb401a-ac25-4418-803c-fc0b2297f2d4" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1028.476351] env[68906]: DEBUG oslo_concurrency.lockutils [None req-ec74f4e8-e37f-4e29-9aee-a15837533de1 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Lock "1fdb401a-ac25-4418-803c-fc0b2297f2d4" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1028.615402] env[68906]: DEBUG oslo_concurrency.lockutils [None req-8e5cf191-d2ff-420a-92ea-3d8a1505aeef tempest-ServersTestManualDisk-672681136 tempest-ServersTestManualDisk-672681136-project-member] Releasing lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1028.615643] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-8e5cf191-d2ff-420a-92ea-3d8a1505aeef tempest-ServersTestManualDisk-672681136 tempest-ServersTestManualDisk-672681136-project-member] [instance: 641cca5b-d749-4331-a5e0-8acb6d47cba2] Processing image b1400c31-d33b-4e13-944f-4c645e62493e {{(pid=68906) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1028.615853] env[68906]: DEBUG oslo_concurrency.lockutils [None req-8e5cf191-d2ff-420a-92ea-3d8a1505aeef tempest-ServersTestManualDisk-672681136 tempest-ServersTestManualDisk-672681136-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e/b1400c31-d33b-4e13-944f-4c645e62493e.vmdk" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1029.590299] env[68906]: DEBUG nova.compute.manager [req-2d09a260-685a-463a-af79-46e002c849bc req-1531053a-74f4-453f-abca-ca134a9254a6 service nova] [instance: 641cca5b-d749-4331-a5e0-8acb6d47cba2] Received event network-changed-b2b5e95c-c1a3-4424-87ed-fe8468a781db {{(pid=68906) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1029.590508] env[68906]: DEBUG nova.compute.manager [req-2d09a260-685a-463a-af79-46e002c849bc req-1531053a-74f4-453f-abca-ca134a9254a6 service nova] [instance: 641cca5b-d749-4331-a5e0-8acb6d47cba2] Refreshing instance network info cache due to event network-changed-b2b5e95c-c1a3-4424-87ed-fe8468a781db. 
{{(pid=68906) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1029.590715] env[68906]: DEBUG oslo_concurrency.lockutils [req-2d09a260-685a-463a-af79-46e002c849bc req-1531053a-74f4-453f-abca-ca134a9254a6 service nova] Acquiring lock "refresh_cache-641cca5b-d749-4331-a5e0-8acb6d47cba2" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1029.590856] env[68906]: DEBUG oslo_concurrency.lockutils [req-2d09a260-685a-463a-af79-46e002c849bc req-1531053a-74f4-453f-abca-ca134a9254a6 service nova] Acquired lock "refresh_cache-641cca5b-d749-4331-a5e0-8acb6d47cba2" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1029.591022] env[68906]: DEBUG nova.network.neutron [req-2d09a260-685a-463a-af79-46e002c849bc req-1531053a-74f4-453f-abca-ca134a9254a6 service nova] [instance: 641cca5b-d749-4331-a5e0-8acb6d47cba2] Refreshing network info cache for port b2b5e95c-c1a3-4424-87ed-fe8468a781db {{(pid=68906) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1029.994873] env[68906]: DEBUG nova.network.neutron [req-2d09a260-685a-463a-af79-46e002c849bc req-1531053a-74f4-453f-abca-ca134a9254a6 service nova] [instance: 641cca5b-d749-4331-a5e0-8acb6d47cba2] Updated VIF entry in instance network info cache for port b2b5e95c-c1a3-4424-87ed-fe8468a781db. {{(pid=68906) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1029.995254] env[68906]: DEBUG nova.network.neutron [req-2d09a260-685a-463a-af79-46e002c849bc req-1531053a-74f4-453f-abca-ca134a9254a6 service nova] [instance: 641cca5b-d749-4331-a5e0-8acb6d47cba2] Updating instance_info_cache with network_info: [{"id": "b2b5e95c-c1a3-4424-87ed-fe8468a781db", "address": "fa:16:3e:e8:51:b6", "network": {"id": "0667b3c4-ef87-4878-8f3c-82e8925d0c25", "bridge": "br-int", "label": "tempest-ServersTestManualDisk-2067406661-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "47716655ec4542cd8b253d5ddff088fb", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "6a2e2e51-010f-4535-ba88-433663275996", "external-id": "nsx-vlan-transportzone-915", "segmentation_id": 915, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapb2b5e95c-c1", "ovs_interfaceid": "b2b5e95c-c1a3-4424-87ed-fe8468a781db", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68906) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1030.004752] env[68906]: DEBUG oslo_concurrency.lockutils [req-2d09a260-685a-463a-af79-46e002c849bc req-1531053a-74f4-453f-abca-ca134a9254a6 service nova] Releasing lock "refresh_cache-641cca5b-d749-4331-a5e0-8acb6d47cba2" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1033.414958] env[68906]: DEBUG oslo_concurrency.lockutils [None req-a836aaa4-8a06-4a84-954e-ba88feaacba8 tempest-ServersTestManualDisk-672681136 tempest-ServersTestManualDisk-672681136-project-member] Acquiring 
lock "641cca5b-d749-4331-a5e0-8acb6d47cba2" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1042.216273] env[68906]: DEBUG oslo_concurrency.lockutils [None req-ca97fdf4-ff68-4e8e-b6cb-46693d4ecb80 tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] Acquiring lock "f42056e5-52cb-4d69-8022-ca643c49194e" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1042.497760] env[68906]: DEBUG oslo_concurrency.lockutils [None req-f74f9aac-5e8a-4885-aea7-d641298084da tempest-VolumesAdminNegativeTest-1259132687 tempest-VolumesAdminNegativeTest-1259132687-project-member] Acquiring lock "d6be39b6-8bbc-4657-9ceb-9a4110c29c53" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1042.497999] env[68906]: DEBUG oslo_concurrency.lockutils [None req-f74f9aac-5e8a-4885-aea7-d641298084da tempest-VolumesAdminNegativeTest-1259132687 tempest-VolumesAdminNegativeTest-1259132687-project-member] Lock "d6be39b6-8bbc-4657-9ceb-9a4110c29c53" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1052.570395] env[68906]: DEBUG oslo_concurrency.lockutils [None req-dab1d23a-194f-4c67-9fbb-9bf4e98100d4 tempest-ServerDiagnosticsNegativeTest-1250564378 tempest-ServerDiagnosticsNegativeTest-1250564378-project-member] Acquiring lock "4a616d87-7b55-4b1f-b938-9d9261e8b2cd" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1052.570802] env[68906]: DEBUG oslo_concurrency.lockutils [None req-dab1d23a-194f-4c67-9fbb-9bf4e98100d4 tempest-ServerDiagnosticsNegativeTest-1250564378 tempest-ServerDiagnosticsNegativeTest-1250564378-project-member] Lock "4a616d87-7b55-4b1f-b938-9d9261e8b2cd" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1053.139451] env[68906]: DEBUG oslo_concurrency.lockutils [None req-af46b527-dfe4-45d6-8cce-779746bfe2e9 tempest-ServersV294TestFqdnHostnames-2119071120 tempest-ServersV294TestFqdnHostnames-2119071120-project-member] Acquiring lock "38248e62-53b8-402e-aa29-d9a445b0af9d" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1053.139611] env[68906]: DEBUG oslo_concurrency.lockutils [None req-af46b527-dfe4-45d6-8cce-779746bfe2e9 tempest-ServersV294TestFqdnHostnames-2119071120 tempest-ServersV294TestFqdnHostnames-2119071120-project-member] Lock "38248e62-53b8-402e-aa29-d9a445b0af9d" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68906) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1055.140777] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1056.136097] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1057.140669] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1059.137927] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1059.140266] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1059.140498] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Starting heal instance info cache {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 1059.140625] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Rebuilding the list of instances to heal {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 1059.173787] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: f42056e5-52cb-4d69-8022-ca643c49194e] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1059.173787] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: ce63789a-1f0f-40ca-8368-ac3f84bb58cd] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1059.173787] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: 9a2d2803-34b1-40f7-9349-e5734a217e18] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1059.173787] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: 13eebe4e-5984-46c3-bb73-cd783ad45df6] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1059.173787] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] Skipping network cache update for instance because it is Building. 
{{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1059.174167] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1059.174167] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: 91d405f9-5ad1-4e8e-9a32-1a5ef81ecc63] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1059.174167] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: acc11633-a489-4d8f-ad76-f17049a91545] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1059.174167] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: e7286888-d79d-4632-9c06-69c1ef47fa50] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1059.174167] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: 641cca5b-d749-4331-a5e0-8acb6d47cba2] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1059.174422] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Didn't find any instances for network info cache update. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 1059.174422] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1059.174422] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1059.174422] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=68906) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 1060.385877] env[68906]: DEBUG oslo_concurrency.lockutils [None req-b7a9f4c8-e833-4e68-ad14-0b824c77959e tempest-ServerDiskConfigTestJSON-1909384467 tempest-ServerDiskConfigTestJSON-1909384467-project-member] Acquiring lock "60ba9060-c3c3-4561-b9e9-e2df08e2e38b" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1060.386249] env[68906]: DEBUG oslo_concurrency.lockutils [None req-b7a9f4c8-e833-4e68-ad14-0b824c77959e tempest-ServerDiskConfigTestJSON-1909384467 tempest-ServerDiskConfigTestJSON-1909384467-project-member] Lock "60ba9060-c3c3-4561-b9e9-e2df08e2e38b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1061.140847] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1062.140488] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1063.140606] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager.update_available_resource {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1063.152410] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1063.152642] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1063.152805] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1063.152956] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=68906) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1063.154274] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0bda9f80-ebcc-4e78-9476-af08254afc4a {{(pid=68906) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1063.163649] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2481244e-f146-43b1-9b07-e9f044b369bf {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1063.177715] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2a5044f0-5a7e-4949-829d-9c106d50ae10 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1063.184007] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-70e2e2f1-2ea7-4d1f-a961-d4ae247fa0a4 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1063.213134] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180955MB free_disk=93GB free_vcpus=48 pci_devices=None {{(pid=68906) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1063.213268] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1063.213457] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1063.286449] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance f42056e5-52cb-4d69-8022-ca643c49194e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1063.286617] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance ce63789a-1f0f-40ca-8368-ac3f84bb58cd actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1063.286745] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 13eebe4e-5984-46c3-bb73-cd783ad45df6 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1063.286868] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 9a2d2803-34b1-40f7-9349-e5734a217e18 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1063.286989] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance a7e0a28f-42a5-442e-b962-07771d2e6a27 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1063.287129] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance eb81e9b1-b573-4d7c-9ede-f8b32a43a201 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1063.287248] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 91d405f9-5ad1-4e8e-9a32-1a5ef81ecc63 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1063.287399] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance acc11633-a489-4d8f-ad76-f17049a91545 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1063.287466] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance e7286888-d79d-4632-9c06-69c1ef47fa50 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1063.287578] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 641cca5b-d749-4331-a5e0-8acb6d47cba2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1063.297912] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 6d7b4648-a12f-4c3c-8465-b8fb37eb0d3c has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1063.307359] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance ad955cdc-85f1-4096-b2ec-7635d289ee57 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1063.316060] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 9bd17d9a-2f34-4e99-91b1-a7c3fbc1f29a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1063.324997] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance a37ef3ce-1c29-48fe-b9c6-023da5b3db71 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1063.333522] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance ee17e223-bec7-4541-8cb2-25e4a6c32b34 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1063.342270] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 3ba4a60f-6c41-4e1e-8928-f1b95b885028 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1063.350611] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 56f936b4-680d-40db-84ab-8eb319f6ee83 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1063.358941] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance faec727e-bd92-4201-aaca-5863208be265 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1063.368540] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 18a5c392-b836-4d2a-bb77-d4af0b9fdb81 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1063.377308] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 2e6de0b1-335b-49bd-aa15-3fd4cc4b4e9e has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1063.385831] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance d71bae07-54c1-427b-bfe1-2467369cd80c has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1063.394415] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 682f0e61-471f-47fb-98de-02449b17d241 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1063.403885] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 4d36bb91-0cde-44cb-8706-d17740a9cf50 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1063.414919] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance db011373-7285-4882-8bce-d39cfa22fe80 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1063.424159] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 1fdb401a-ac25-4418-803c-fc0b2297f2d4 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1063.432848] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance d6be39b6-8bbc-4657-9ceb-9a4110c29c53 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1063.442061] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 4a616d87-7b55-4b1f-b938-9d9261e8b2cd has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1063.453857] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 38248e62-53b8-402e-aa29-d9a445b0af9d has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1063.462969] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 60ba9060-c3c3-4561-b9e9-e2df08e2e38b has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1063.463225] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=68906) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1063.463385] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=68906) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1063.793359] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a8d0258d-622c-44e5-b262-cb60c0c09ce5 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1063.800813] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3ffd28e1-0f66-4d43-a6c0-7fd4119cd8cc {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1063.830541] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-aaf3368b-133d-4687-854a-f3b7d0aa48f5 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1063.837666] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-460e7d15-a927-4566-ab9e-cb61441f8e2e {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1063.851707] env[68906]: DEBUG nova.compute.provider_tree [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Inventory has not changed in ProviderTree for provider: 
1119f6db-bfd7-4ef3-bdff-5c6974dc249b {{(pid=68906) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1063.859627] env[68906]: DEBUG nova.scheduler.client.report [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Inventory has not changed for provider 1119f6db-bfd7-4ef3-bdff-5c6974dc249b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 93, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68906) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1063.872926] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=68906) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1063.873123] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.660s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1071.468677] env[68906]: DEBUG oslo_concurrency.lockutils [None req-d9b622cd-9243-4719-bcad-1768eb655752 tempest-MultipleCreateTestJSON-422056473 tempest-MultipleCreateTestJSON-422056473-project-member] Acquiring lock "e8a14af6-ab4f-407e-943d-4dd3a46c8711" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1071.469745] env[68906]: DEBUG oslo_concurrency.lockutils [None req-d9b622cd-9243-4719-bcad-1768eb655752 tempest-MultipleCreateTestJSON-422056473 tempest-MultipleCreateTestJSON-422056473-project-member] Lock "e8a14af6-ab4f-407e-943d-4dd3a46c8711" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1071.499717] env[68906]: DEBUG oslo_concurrency.lockutils [None req-d9b622cd-9243-4719-bcad-1768eb655752 tempest-MultipleCreateTestJSON-422056473 tempest-MultipleCreateTestJSON-422056473-project-member] Acquiring lock "57078f52-8070-480e-b8ea-278ef759f0a3" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1071.499841] env[68906]: DEBUG oslo_concurrency.lockutils [None req-d9b622cd-9243-4719-bcad-1768eb655752 tempest-MultipleCreateTestJSON-422056473 tempest-MultipleCreateTestJSON-422056473-project-member] Lock "57078f52-8070-480e-b8ea-278ef759f0a3" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1072.629040] env[68906]: WARNING oslo_vmware.rw_handles [None req-25b03b1c-8272-403d-9ff2-a1dba7656508 tempest-VolumesAdminNegativeTest-1259132687 tempest-VolumesAdminNegativeTest-1259132687-project-member] Error 
occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1072.629040] env[68906]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1072.629040] env[68906]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1072.629040] env[68906]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1072.629040] env[68906]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1072.629040] env[68906]: ERROR oslo_vmware.rw_handles response.begin() [ 1072.629040] env[68906]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1072.629040] env[68906]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1072.629040] env[68906]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1072.629040] env[68906]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1072.629040] env[68906]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1072.629040] env[68906]: ERROR oslo_vmware.rw_handles [ 1072.629639] env[68906]: DEBUG nova.virt.vmwareapi.images [None req-25b03b1c-8272-403d-9ff2-a1dba7656508 tempest-VolumesAdminNegativeTest-1259132687 tempest-VolumesAdminNegativeTest-1259132687-project-member] [instance: ce63789a-1f0f-40ca-8368-ac3f84bb58cd] Downloaded image file data b1400c31-d33b-4e13-944f-4c645e62493e to vmware_temp/8c58a1d2-1681-46d0-b296-73023cc658b8/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk on the data store datastore2 {{(pid=68906) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1072.631366] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-25b03b1c-8272-403d-9ff2-a1dba7656508 tempest-VolumesAdminNegativeTest-1259132687 tempest-VolumesAdminNegativeTest-1259132687-project-member] [instance: ce63789a-1f0f-40ca-8368-ac3f84bb58cd] Caching image {{(pid=68906) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1072.631614] env[68906]: DEBUG nova.virt.vmwareapi.vm_util [None req-25b03b1c-8272-403d-9ff2-a1dba7656508 tempest-VolumesAdminNegativeTest-1259132687 tempest-VolumesAdminNegativeTest-1259132687-project-member] Copying Virtual Disk [datastore2] vmware_temp/8c58a1d2-1681-46d0-b296-73023cc658b8/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk to [datastore2] vmware_temp/8c58a1d2-1681-46d0-b296-73023cc658b8/b1400c31-d33b-4e13-944f-4c645e62493e/b1400c31-d33b-4e13-944f-4c645e62493e.vmdk {{(pid=68906) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1072.631907] env[68906]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-db9fd9a9-5b1e-428f-bda6-694850126d98 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1072.640580] env[68906]: DEBUG oslo_vmware.api [None req-25b03b1c-8272-403d-9ff2-a1dba7656508 tempest-VolumesAdminNegativeTest-1259132687 tempest-VolumesAdminNegativeTest-1259132687-project-member] Waiting for the task: (returnval){ [ 1072.640580] env[68906]: value = "task-3475349" [ 1072.640580] env[68906]: _type = "Task" [ 1072.640580] env[68906]: } to complete. 
{{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1072.648831] env[68906]: DEBUG oslo_vmware.api [None req-25b03b1c-8272-403d-9ff2-a1dba7656508 tempest-VolumesAdminNegativeTest-1259132687 tempest-VolumesAdminNegativeTest-1259132687-project-member] Task: {'id': task-3475349, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1073.150083] env[68906]: DEBUG oslo_vmware.exceptions [None req-25b03b1c-8272-403d-9ff2-a1dba7656508 tempest-VolumesAdminNegativeTest-1259132687 tempest-VolumesAdminNegativeTest-1259132687-project-member] Fault InvalidArgument not matched. {{(pid=68906) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1073.150413] env[68906]: DEBUG oslo_concurrency.lockutils [None req-25b03b1c-8272-403d-9ff2-a1dba7656508 tempest-VolumesAdminNegativeTest-1259132687 tempest-VolumesAdminNegativeTest-1259132687-project-member] Releasing lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e/b1400c31-d33b-4e13-944f-4c645e62493e.vmdk" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1073.150989] env[68906]: ERROR nova.compute.manager [None req-25b03b1c-8272-403d-9ff2-a1dba7656508 tempest-VolumesAdminNegativeTest-1259132687 tempest-VolumesAdminNegativeTest-1259132687-project-member] [instance: ce63789a-1f0f-40ca-8368-ac3f84bb58cd] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1073.150989] env[68906]: Faults: ['InvalidArgument'] [ 1073.150989] env[68906]: ERROR nova.compute.manager [instance: ce63789a-1f0f-40ca-8368-ac3f84bb58cd] Traceback (most recent call last): [ 1073.150989] env[68906]: ERROR nova.compute.manager [instance: ce63789a-1f0f-40ca-8368-ac3f84bb58cd] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1073.150989] env[68906]: ERROR nova.compute.manager [instance: ce63789a-1f0f-40ca-8368-ac3f84bb58cd] yield resources [ 1073.150989] env[68906]: ERROR nova.compute.manager [instance: ce63789a-1f0f-40ca-8368-ac3f84bb58cd] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1073.150989] env[68906]: ERROR nova.compute.manager [instance: ce63789a-1f0f-40ca-8368-ac3f84bb58cd] self.driver.spawn(context, instance, image_meta, [ 1073.150989] env[68906]: ERROR nova.compute.manager [instance: ce63789a-1f0f-40ca-8368-ac3f84bb58cd] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1073.150989] env[68906]: ERROR nova.compute.manager [instance: ce63789a-1f0f-40ca-8368-ac3f84bb58cd] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1073.150989] env[68906]: ERROR nova.compute.manager [instance: ce63789a-1f0f-40ca-8368-ac3f84bb58cd] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1073.150989] env[68906]: ERROR nova.compute.manager [instance: ce63789a-1f0f-40ca-8368-ac3f84bb58cd] self._fetch_image_if_missing(context, vi) [ 1073.150989] env[68906]: ERROR nova.compute.manager [instance: ce63789a-1f0f-40ca-8368-ac3f84bb58cd] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1073.151367] env[68906]: ERROR nova.compute.manager [instance: ce63789a-1f0f-40ca-8368-ac3f84bb58cd] image_cache(vi, tmp_image_ds_loc) [ 1073.151367] env[68906]: ERROR nova.compute.manager [instance: 
ce63789a-1f0f-40ca-8368-ac3f84bb58cd] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1073.151367] env[68906]: ERROR nova.compute.manager [instance: ce63789a-1f0f-40ca-8368-ac3f84bb58cd] vm_util.copy_virtual_disk( [ 1073.151367] env[68906]: ERROR nova.compute.manager [instance: ce63789a-1f0f-40ca-8368-ac3f84bb58cd] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1073.151367] env[68906]: ERROR nova.compute.manager [instance: ce63789a-1f0f-40ca-8368-ac3f84bb58cd] session._wait_for_task(vmdk_copy_task) [ 1073.151367] env[68906]: ERROR nova.compute.manager [instance: ce63789a-1f0f-40ca-8368-ac3f84bb58cd] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1073.151367] env[68906]: ERROR nova.compute.manager [instance: ce63789a-1f0f-40ca-8368-ac3f84bb58cd] return self.wait_for_task(task_ref) [ 1073.151367] env[68906]: ERROR nova.compute.manager [instance: ce63789a-1f0f-40ca-8368-ac3f84bb58cd] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1073.151367] env[68906]: ERROR nova.compute.manager [instance: ce63789a-1f0f-40ca-8368-ac3f84bb58cd] return evt.wait() [ 1073.151367] env[68906]: ERROR nova.compute.manager [instance: ce63789a-1f0f-40ca-8368-ac3f84bb58cd] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1073.151367] env[68906]: ERROR nova.compute.manager [instance: ce63789a-1f0f-40ca-8368-ac3f84bb58cd] result = hub.switch() [ 1073.151367] env[68906]: ERROR nova.compute.manager [instance: ce63789a-1f0f-40ca-8368-ac3f84bb58cd] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1073.151367] env[68906]: ERROR nova.compute.manager [instance: ce63789a-1f0f-40ca-8368-ac3f84bb58cd] return self.greenlet.switch() [ 1073.151713] env[68906]: ERROR nova.compute.manager [instance: ce63789a-1f0f-40ca-8368-ac3f84bb58cd] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1073.151713] env[68906]: ERROR nova.compute.manager [instance: ce63789a-1f0f-40ca-8368-ac3f84bb58cd] self.f(*self.args, **self.kw) [ 1073.151713] env[68906]: ERROR nova.compute.manager [instance: ce63789a-1f0f-40ca-8368-ac3f84bb58cd] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1073.151713] env[68906]: ERROR nova.compute.manager [instance: ce63789a-1f0f-40ca-8368-ac3f84bb58cd] raise exceptions.translate_fault(task_info.error) [ 1073.151713] env[68906]: ERROR nova.compute.manager [instance: ce63789a-1f0f-40ca-8368-ac3f84bb58cd] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1073.151713] env[68906]: ERROR nova.compute.manager [instance: ce63789a-1f0f-40ca-8368-ac3f84bb58cd] Faults: ['InvalidArgument'] [ 1073.151713] env[68906]: ERROR nova.compute.manager [instance: ce63789a-1f0f-40ca-8368-ac3f84bb58cd] [ 1073.151713] env[68906]: INFO nova.compute.manager [None req-25b03b1c-8272-403d-9ff2-a1dba7656508 tempest-VolumesAdminNegativeTest-1259132687 tempest-VolumesAdminNegativeTest-1259132687-project-member] [instance: ce63789a-1f0f-40ca-8368-ac3f84bb58cd] Terminating instance [ 1073.152835] env[68906]: DEBUG oslo_concurrency.lockutils [None req-dac96d6f-6d7b-49e4-904b-bb16242f78bb tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] Acquired lock "[datastore2] 
devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e/b1400c31-d33b-4e13-944f-4c645e62493e.vmdk" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1073.153052] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-dac96d6f-6d7b-49e4-904b-bb16242f78bb tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=68906) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1073.153292] env[68906]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-305711cc-9e11-4267-a93c-ea24b039c9f9 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1073.155603] env[68906]: DEBUG nova.compute.manager [None req-25b03b1c-8272-403d-9ff2-a1dba7656508 tempest-VolumesAdminNegativeTest-1259132687 tempest-VolumesAdminNegativeTest-1259132687-project-member] [instance: ce63789a-1f0f-40ca-8368-ac3f84bb58cd] Start destroying the instance on the hypervisor. {{(pid=68906) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1073.155792] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-25b03b1c-8272-403d-9ff2-a1dba7656508 tempest-VolumesAdminNegativeTest-1259132687 tempest-VolumesAdminNegativeTest-1259132687-project-member] [instance: ce63789a-1f0f-40ca-8368-ac3f84bb58cd] Destroying instance {{(pid=68906) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1073.156500] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5c6e48e8-4df0-415c-bbcc-216e0da4a35a {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1073.163327] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-25b03b1c-8272-403d-9ff2-a1dba7656508 tempest-VolumesAdminNegativeTest-1259132687 tempest-VolumesAdminNegativeTest-1259132687-project-member] [instance: ce63789a-1f0f-40ca-8368-ac3f84bb58cd] Unregistering the VM {{(pid=68906) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1073.163487] env[68906]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-39080c65-987c-42a8-b855-6b65c4559dd1 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1073.165590] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-dac96d6f-6d7b-49e4-904b-bb16242f78bb tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=68906) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1073.165768] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-dac96d6f-6d7b-49e4-904b-bb16242f78bb tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] Folder [datastore2] devstack-image-cache_base created. 
{{(pid=68906) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1073.166734] env[68906]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-2e0af97a-6450-4162-87ba-71b36e6f7b04 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1073.172210] env[68906]: DEBUG oslo_vmware.api [None req-dac96d6f-6d7b-49e4-904b-bb16242f78bb tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] Waiting for the task: (returnval){ [ 1073.172210] env[68906]: value = "session[52a3cb0d-1212-490c-2669-91043b4da4d8]5238c2c8-e747-0974-7337-1e366cc1903a" [ 1073.172210] env[68906]: _type = "Task" [ 1073.172210] env[68906]: } to complete. {{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1073.179975] env[68906]: DEBUG oslo_vmware.api [None req-dac96d6f-6d7b-49e4-904b-bb16242f78bb tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] Task: {'id': session[52a3cb0d-1212-490c-2669-91043b4da4d8]5238c2c8-e747-0974-7337-1e366cc1903a, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1073.240256] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-25b03b1c-8272-403d-9ff2-a1dba7656508 tempest-VolumesAdminNegativeTest-1259132687 tempest-VolumesAdminNegativeTest-1259132687-project-member] [instance: ce63789a-1f0f-40ca-8368-ac3f84bb58cd] Unregistered the VM {{(pid=68906) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1073.240484] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-25b03b1c-8272-403d-9ff2-a1dba7656508 tempest-VolumesAdminNegativeTest-1259132687 tempest-VolumesAdminNegativeTest-1259132687-project-member] [instance: ce63789a-1f0f-40ca-8368-ac3f84bb58cd] Deleting contents of the VM from datastore datastore2 {{(pid=68906) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1073.240662] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-25b03b1c-8272-403d-9ff2-a1dba7656508 tempest-VolumesAdminNegativeTest-1259132687 tempest-VolumesAdminNegativeTest-1259132687-project-member] Deleting the datastore file [datastore2] ce63789a-1f0f-40ca-8368-ac3f84bb58cd {{(pid=68906) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1073.240950] env[68906]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-83fe696b-ed03-46a7-a359-f44d1af95d84 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1073.246888] env[68906]: DEBUG oslo_vmware.api [None req-25b03b1c-8272-403d-9ff2-a1dba7656508 tempest-VolumesAdminNegativeTest-1259132687 tempest-VolumesAdminNegativeTest-1259132687-project-member] Waiting for the task: (returnval){ [ 1073.246888] env[68906]: value = "task-3475351" [ 1073.246888] env[68906]: _type = "Task" [ 1073.246888] env[68906]: } to complete. {{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1073.254708] env[68906]: DEBUG oslo_vmware.api [None req-25b03b1c-8272-403d-9ff2-a1dba7656508 tempest-VolumesAdminNegativeTest-1259132687 tempest-VolumesAdminNegativeTest-1259132687-project-member] Task: {'id': task-3475351, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1073.683093] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-dac96d6f-6d7b-49e4-904b-bb16242f78bb tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] [instance: f42056e5-52cb-4d69-8022-ca643c49194e] Preparing fetch location {{(pid=68906) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1073.683404] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-dac96d6f-6d7b-49e4-904b-bb16242f78bb tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] Creating directory with path [datastore2] vmware_temp/50995ed0-6bf4-4dff-a727-c43c53444392/b1400c31-d33b-4e13-944f-4c645e62493e {{(pid=68906) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1073.683612] env[68906]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-3139dfcb-1f8d-486e-9938-87cbe9719061 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1073.694899] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-dac96d6f-6d7b-49e4-904b-bb16242f78bb tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] Created directory with path [datastore2] vmware_temp/50995ed0-6bf4-4dff-a727-c43c53444392/b1400c31-d33b-4e13-944f-4c645e62493e {{(pid=68906) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1073.695106] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-dac96d6f-6d7b-49e4-904b-bb16242f78bb tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] [instance: f42056e5-52cb-4d69-8022-ca643c49194e] Fetch image to [datastore2] vmware_temp/50995ed0-6bf4-4dff-a727-c43c53444392/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk {{(pid=68906) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1073.695282] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-dac96d6f-6d7b-49e4-904b-bb16242f78bb tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] [instance: f42056e5-52cb-4d69-8022-ca643c49194e] Downloading image file data b1400c31-d33b-4e13-944f-4c645e62493e to [datastore2] vmware_temp/50995ed0-6bf4-4dff-a727-c43c53444392/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk on the data store datastore2 {{(pid=68906) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1073.695992] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c99095ec-1c90-4294-a731-541132076faf {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1073.702753] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2e8e6e2f-7382-4c0c-9e15-bfd3d249a3fd {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1073.711602] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d2083c31-cc7d-435e-baeb-640e163c68f8 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1073.744227] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-41a8c896-3233-4cbe-8790-84dd8e6989b6 {{(pid=68906) 
request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1073.752901] env[68906]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-400cd298-ac18-47e8-ad07-daf987b8652c {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1073.757230] env[68906]: DEBUG oslo_vmware.api [None req-25b03b1c-8272-403d-9ff2-a1dba7656508 tempest-VolumesAdminNegativeTest-1259132687 tempest-VolumesAdminNegativeTest-1259132687-project-member] Task: {'id': task-3475351, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.072419} completed successfully. {{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1073.757766] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-25b03b1c-8272-403d-9ff2-a1dba7656508 tempest-VolumesAdminNegativeTest-1259132687 tempest-VolumesAdminNegativeTest-1259132687-project-member] Deleted the datastore file {{(pid=68906) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1073.758483] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-25b03b1c-8272-403d-9ff2-a1dba7656508 tempest-VolumesAdminNegativeTest-1259132687 tempest-VolumesAdminNegativeTest-1259132687-project-member] [instance: ce63789a-1f0f-40ca-8368-ac3f84bb58cd] Deleted contents of the VM from datastore datastore2 {{(pid=68906) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1073.758483] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-25b03b1c-8272-403d-9ff2-a1dba7656508 tempest-VolumesAdminNegativeTest-1259132687 tempest-VolumesAdminNegativeTest-1259132687-project-member] [instance: ce63789a-1f0f-40ca-8368-ac3f84bb58cd] Instance destroyed {{(pid=68906) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1073.758483] env[68906]: INFO nova.compute.manager [None req-25b03b1c-8272-403d-9ff2-a1dba7656508 tempest-VolumesAdminNegativeTest-1259132687 tempest-VolumesAdminNegativeTest-1259132687-project-member] [instance: ce63789a-1f0f-40ca-8368-ac3f84bb58cd] Took 0.60 seconds to destroy the instance on the hypervisor. 
[ 1073.761462] env[68906]: DEBUG nova.compute.claims [None req-25b03b1c-8272-403d-9ff2-a1dba7656508 tempest-VolumesAdminNegativeTest-1259132687 tempest-VolumesAdminNegativeTest-1259132687-project-member] [instance: ce63789a-1f0f-40ca-8368-ac3f84bb58cd] Aborting claim: {{(pid=68906) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1073.761636] env[68906]: DEBUG oslo_concurrency.lockutils [None req-25b03b1c-8272-403d-9ff2-a1dba7656508 tempest-VolumesAdminNegativeTest-1259132687 tempest-VolumesAdminNegativeTest-1259132687-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1073.761847] env[68906]: DEBUG oslo_concurrency.lockutils [None req-25b03b1c-8272-403d-9ff2-a1dba7656508 tempest-VolumesAdminNegativeTest-1259132687 tempest-VolumesAdminNegativeTest-1259132687-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1073.776646] env[68906]: DEBUG nova.virt.vmwareapi.images [None req-dac96d6f-6d7b-49e4-904b-bb16242f78bb tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] [instance: f42056e5-52cb-4d69-8022-ca643c49194e] Downloading image file data b1400c31-d33b-4e13-944f-4c645e62493e to the data store datastore2 {{(pid=68906) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1073.831505] env[68906]: DEBUG oslo_vmware.rw_handles [None req-dac96d6f-6d7b-49e4-904b-bb16242f78bb tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/50995ed0-6bf4-4dff-a727-c43c53444392/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=68906) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1073.893216] env[68906]: DEBUG oslo_vmware.rw_handles [None req-dac96d6f-6d7b-49e4-904b-bb16242f78bb tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] Completed reading data from the image iterator. {{(pid=68906) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1073.893515] env[68906]: DEBUG oslo_vmware.rw_handles [None req-dac96d6f-6d7b-49e4-904b-bb16242f78bb tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] Closing write handle for https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/50995ed0-6bf4-4dff-a727-c43c53444392/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=68906) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1074.254104] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4689b6b9-d4d7-4cc2-8e70-bf894812194d {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1074.261630] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c244bfa5-14b0-4f25-82e6-7ed698b52f8b {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1074.292252] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a8a2eeaa-9383-4cac-98b7-97eac4d3d26b {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1074.299722] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5acd77c1-1c1c-4351-8cbf-0fe7fc79994f {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1074.312892] env[68906]: DEBUG nova.compute.provider_tree [None req-25b03b1c-8272-403d-9ff2-a1dba7656508 tempest-VolumesAdminNegativeTest-1259132687 tempest-VolumesAdminNegativeTest-1259132687-project-member] Inventory has not changed in ProviderTree for provider: 1119f6db-bfd7-4ef3-bdff-5c6974dc249b {{(pid=68906) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1074.321434] env[68906]: DEBUG nova.scheduler.client.report [None req-25b03b1c-8272-403d-9ff2-a1dba7656508 tempest-VolumesAdminNegativeTest-1259132687 tempest-VolumesAdminNegativeTest-1259132687-project-member] Inventory has not changed for provider 1119f6db-bfd7-4ef3-bdff-5c6974dc249b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 93, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68906) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1074.341760] env[68906]: DEBUG oslo_concurrency.lockutils [None req-25b03b1c-8272-403d-9ff2-a1dba7656508 tempest-VolumesAdminNegativeTest-1259132687 tempest-VolumesAdminNegativeTest-1259132687-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.579s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1074.341760] env[68906]: ERROR nova.compute.manager [None req-25b03b1c-8272-403d-9ff2-a1dba7656508 tempest-VolumesAdminNegativeTest-1259132687 tempest-VolumesAdminNegativeTest-1259132687-project-member] [instance: ce63789a-1f0f-40ca-8368-ac3f84bb58cd] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1074.341760] env[68906]: Faults: ['InvalidArgument'] [ 1074.341760] env[68906]: ERROR nova.compute.manager [instance: ce63789a-1f0f-40ca-8368-ac3f84bb58cd] Traceback (most recent call last): [ 1074.341760] env[68906]: ERROR nova.compute.manager [instance: ce63789a-1f0f-40ca-8368-ac3f84bb58cd] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1074.341760] 
env[68906]: ERROR nova.compute.manager [instance: ce63789a-1f0f-40ca-8368-ac3f84bb58cd] self.driver.spawn(context, instance, image_meta, [ 1074.341760] env[68906]: ERROR nova.compute.manager [instance: ce63789a-1f0f-40ca-8368-ac3f84bb58cd] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1074.341760] env[68906]: ERROR nova.compute.manager [instance: ce63789a-1f0f-40ca-8368-ac3f84bb58cd] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1074.341760] env[68906]: ERROR nova.compute.manager [instance: ce63789a-1f0f-40ca-8368-ac3f84bb58cd] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1074.341760] env[68906]: ERROR nova.compute.manager [instance: ce63789a-1f0f-40ca-8368-ac3f84bb58cd] self._fetch_image_if_missing(context, vi) [ 1074.342117] env[68906]: ERROR nova.compute.manager [instance: ce63789a-1f0f-40ca-8368-ac3f84bb58cd] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1074.342117] env[68906]: ERROR nova.compute.manager [instance: ce63789a-1f0f-40ca-8368-ac3f84bb58cd] image_cache(vi, tmp_image_ds_loc) [ 1074.342117] env[68906]: ERROR nova.compute.manager [instance: ce63789a-1f0f-40ca-8368-ac3f84bb58cd] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1074.342117] env[68906]: ERROR nova.compute.manager [instance: ce63789a-1f0f-40ca-8368-ac3f84bb58cd] vm_util.copy_virtual_disk( [ 1074.342117] env[68906]: ERROR nova.compute.manager [instance: ce63789a-1f0f-40ca-8368-ac3f84bb58cd] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1074.342117] env[68906]: ERROR nova.compute.manager [instance: ce63789a-1f0f-40ca-8368-ac3f84bb58cd] session._wait_for_task(vmdk_copy_task) [ 1074.342117] env[68906]: ERROR nova.compute.manager [instance: ce63789a-1f0f-40ca-8368-ac3f84bb58cd] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1074.342117] env[68906]: ERROR nova.compute.manager [instance: ce63789a-1f0f-40ca-8368-ac3f84bb58cd] return self.wait_for_task(task_ref) [ 1074.342117] env[68906]: ERROR nova.compute.manager [instance: ce63789a-1f0f-40ca-8368-ac3f84bb58cd] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1074.342117] env[68906]: ERROR nova.compute.manager [instance: ce63789a-1f0f-40ca-8368-ac3f84bb58cd] return evt.wait() [ 1074.342117] env[68906]: ERROR nova.compute.manager [instance: ce63789a-1f0f-40ca-8368-ac3f84bb58cd] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1074.342117] env[68906]: ERROR nova.compute.manager [instance: ce63789a-1f0f-40ca-8368-ac3f84bb58cd] result = hub.switch() [ 1074.342117] env[68906]: ERROR nova.compute.manager [instance: ce63789a-1f0f-40ca-8368-ac3f84bb58cd] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1074.342423] env[68906]: ERROR nova.compute.manager [instance: ce63789a-1f0f-40ca-8368-ac3f84bb58cd] return self.greenlet.switch() [ 1074.342423] env[68906]: ERROR nova.compute.manager [instance: ce63789a-1f0f-40ca-8368-ac3f84bb58cd] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1074.342423] env[68906]: ERROR nova.compute.manager [instance: ce63789a-1f0f-40ca-8368-ac3f84bb58cd] self.f(*self.args, **self.kw) [ 1074.342423] env[68906]: ERROR nova.compute.manager [instance: ce63789a-1f0f-40ca-8368-ac3f84bb58cd] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1074.342423] env[68906]: ERROR nova.compute.manager [instance: ce63789a-1f0f-40ca-8368-ac3f84bb58cd] raise exceptions.translate_fault(task_info.error) [ 1074.342423] env[68906]: ERROR nova.compute.manager [instance: ce63789a-1f0f-40ca-8368-ac3f84bb58cd] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1074.342423] env[68906]: ERROR nova.compute.manager [instance: ce63789a-1f0f-40ca-8368-ac3f84bb58cd] Faults: ['InvalidArgument'] [ 1074.342423] env[68906]: ERROR nova.compute.manager [instance: ce63789a-1f0f-40ca-8368-ac3f84bb58cd] [ 1074.342423] env[68906]: DEBUG nova.compute.utils [None req-25b03b1c-8272-403d-9ff2-a1dba7656508 tempest-VolumesAdminNegativeTest-1259132687 tempest-VolumesAdminNegativeTest-1259132687-project-member] [instance: ce63789a-1f0f-40ca-8368-ac3f84bb58cd] VimFaultException {{(pid=68906) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1074.344029] env[68906]: DEBUG nova.compute.manager [None req-25b03b1c-8272-403d-9ff2-a1dba7656508 tempest-VolumesAdminNegativeTest-1259132687 tempest-VolumesAdminNegativeTest-1259132687-project-member] [instance: ce63789a-1f0f-40ca-8368-ac3f84bb58cd] Build of instance ce63789a-1f0f-40ca-8368-ac3f84bb58cd was re-scheduled: A specified parameter was not correct: fileType [ 1074.344029] env[68906]: Faults: ['InvalidArgument'] {{(pid=68906) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1074.345722] env[68906]: DEBUG nova.compute.manager [None req-25b03b1c-8272-403d-9ff2-a1dba7656508 tempest-VolumesAdminNegativeTest-1259132687 tempest-VolumesAdminNegativeTest-1259132687-project-member] [instance: ce63789a-1f0f-40ca-8368-ac3f84bb58cd] Unplugging VIFs for instance {{(pid=68906) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1074.345722] env[68906]: DEBUG nova.compute.manager [None req-25b03b1c-8272-403d-9ff2-a1dba7656508 tempest-VolumesAdminNegativeTest-1259132687 tempest-VolumesAdminNegativeTest-1259132687-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=68906) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1074.345722] env[68906]: DEBUG nova.compute.manager [None req-25b03b1c-8272-403d-9ff2-a1dba7656508 tempest-VolumesAdminNegativeTest-1259132687 tempest-VolumesAdminNegativeTest-1259132687-project-member] [instance: ce63789a-1f0f-40ca-8368-ac3f84bb58cd] Deallocating network for instance {{(pid=68906) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1074.345722] env[68906]: DEBUG nova.network.neutron [None req-25b03b1c-8272-403d-9ff2-a1dba7656508 tempest-VolumesAdminNegativeTest-1259132687 tempest-VolumesAdminNegativeTest-1259132687-project-member] [instance: ce63789a-1f0f-40ca-8368-ac3f84bb58cd] deallocate_for_instance() {{(pid=68906) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1075.234307] env[68906]: DEBUG nova.network.neutron [None req-25b03b1c-8272-403d-9ff2-a1dba7656508 tempest-VolumesAdminNegativeTest-1259132687 tempest-VolumesAdminNegativeTest-1259132687-project-member] [instance: ce63789a-1f0f-40ca-8368-ac3f84bb58cd] Updating instance_info_cache with network_info: [] {{(pid=68906) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1075.252839] env[68906]: INFO nova.compute.manager [None req-25b03b1c-8272-403d-9ff2-a1dba7656508 tempest-VolumesAdminNegativeTest-1259132687 tempest-VolumesAdminNegativeTest-1259132687-project-member] [instance: ce63789a-1f0f-40ca-8368-ac3f84bb58cd] Took 0.91 seconds to deallocate network for instance. [ 1075.389521] env[68906]: INFO nova.scheduler.client.report [None req-25b03b1c-8272-403d-9ff2-a1dba7656508 tempest-VolumesAdminNegativeTest-1259132687 tempest-VolumesAdminNegativeTest-1259132687-project-member] Deleted allocations for instance ce63789a-1f0f-40ca-8368-ac3f84bb58cd [ 1075.425502] env[68906]: DEBUG oslo_concurrency.lockutils [None req-25b03b1c-8272-403d-9ff2-a1dba7656508 tempest-VolumesAdminNegativeTest-1259132687 tempest-VolumesAdminNegativeTest-1259132687-project-member] Lock "ce63789a-1f0f-40ca-8368-ac3f84bb58cd" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 429.324s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1075.426255] env[68906]: DEBUG oslo_concurrency.lockutils [None req-8567c8ff-035a-4e6a-9e89-f6bf5c0abf8d tempest-VolumesAdminNegativeTest-1259132687 tempest-VolumesAdminNegativeTest-1259132687-project-member] Lock "ce63789a-1f0f-40ca-8368-ac3f84bb58cd" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 230.469s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1075.426638] env[68906]: DEBUG oslo_concurrency.lockutils [None req-8567c8ff-035a-4e6a-9e89-f6bf5c0abf8d tempest-VolumesAdminNegativeTest-1259132687 tempest-VolumesAdminNegativeTest-1259132687-project-member] Acquiring lock "ce63789a-1f0f-40ca-8368-ac3f84bb58cd-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1075.427270] env[68906]: DEBUG oslo_concurrency.lockutils [None req-8567c8ff-035a-4e6a-9e89-f6bf5c0abf8d tempest-VolumesAdminNegativeTest-1259132687 tempest-VolumesAdminNegativeTest-1259132687-project-member] Lock "ce63789a-1f0f-40ca-8368-ac3f84bb58cd-events" acquired by
"nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1075.427702] env[68906]: DEBUG oslo_concurrency.lockutils [None req-8567c8ff-035a-4e6a-9e89-f6bf5c0abf8d tempest-VolumesAdminNegativeTest-1259132687 tempest-VolumesAdminNegativeTest-1259132687-project-member] Lock "ce63789a-1f0f-40ca-8368-ac3f84bb58cd-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1075.431015] env[68906]: INFO nova.compute.manager [None req-8567c8ff-035a-4e6a-9e89-f6bf5c0abf8d tempest-VolumesAdminNegativeTest-1259132687 tempest-VolumesAdminNegativeTest-1259132687-project-member] [instance: ce63789a-1f0f-40ca-8368-ac3f84bb58cd] Terminating instance [ 1075.431842] env[68906]: DEBUG nova.compute.manager [None req-8567c8ff-035a-4e6a-9e89-f6bf5c0abf8d tempest-VolumesAdminNegativeTest-1259132687 tempest-VolumesAdminNegativeTest-1259132687-project-member] [instance: ce63789a-1f0f-40ca-8368-ac3f84bb58cd] Start destroying the instance on the hypervisor. {{(pid=68906) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1075.432368] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-8567c8ff-035a-4e6a-9e89-f6bf5c0abf8d tempest-VolumesAdminNegativeTest-1259132687 tempest-VolumesAdminNegativeTest-1259132687-project-member] [instance: ce63789a-1f0f-40ca-8368-ac3f84bb58cd] Destroying instance {{(pid=68906) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1075.433198] env[68906]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-0927d3ac-2fce-4e46-bbcc-0a73cef5822a {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1075.442036] env[68906]: DEBUG nova.compute.manager [None req-d970c352-17f4-4e98-8d45-165ae6d79067 tempest-ServersTestMultiNic-1243959320 tempest-ServersTestMultiNic-1243959320-project-member] [instance: 6d7b4648-a12f-4c3c-8465-b8fb37eb0d3c] Starting instance... {{(pid=68906) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1075.447853] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6f8ab563-c5a1-48cb-8055-fc6d3af0f4fe {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1075.472038] env[68906]: DEBUG nova.compute.manager [None req-d970c352-17f4-4e98-8d45-165ae6d79067 tempest-ServersTestMultiNic-1243959320 tempest-ServersTestMultiNic-1243959320-project-member] [instance: 6d7b4648-a12f-4c3c-8465-b8fb37eb0d3c] Instance disappeared before build. {{(pid=68906) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1075.483044] env[68906]: WARNING nova.virt.vmwareapi.vmops [None req-8567c8ff-035a-4e6a-9e89-f6bf5c0abf8d tempest-VolumesAdminNegativeTest-1259132687 tempest-VolumesAdminNegativeTest-1259132687-project-member] [instance: ce63789a-1f0f-40ca-8368-ac3f84bb58cd] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance ce63789a-1f0f-40ca-8368-ac3f84bb58cd could not be found.
[ 1075.483548] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-8567c8ff-035a-4e6a-9e89-f6bf5c0abf8d tempest-VolumesAdminNegativeTest-1259132687 tempest-VolumesAdminNegativeTest-1259132687-project-member] [instance: ce63789a-1f0f-40ca-8368-ac3f84bb58cd] Instance destroyed {{(pid=68906) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1075.483840] env[68906]: INFO nova.compute.manager [None req-8567c8ff-035a-4e6a-9e89-f6bf5c0abf8d tempest-VolumesAdminNegativeTest-1259132687 tempest-VolumesAdminNegativeTest-1259132687-project-member] [instance: ce63789a-1f0f-40ca-8368-ac3f84bb58cd] Took 0.05 seconds to destroy the instance on the hypervisor. [ 1075.484255] env[68906]: DEBUG oslo.service.loopingcall [None req-8567c8ff-035a-4e6a-9e89-f6bf5c0abf8d tempest-VolumesAdminNegativeTest-1259132687 tempest-VolumesAdminNegativeTest-1259132687-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=68906) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1075.484916] env[68906]: DEBUG nova.compute.manager [-] [instance: ce63789a-1f0f-40ca-8368-ac3f84bb58cd] Deallocating network for instance {{(pid=68906) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1075.485174] env[68906]: DEBUG nova.network.neutron [-] [instance: ce63789a-1f0f-40ca-8368-ac3f84bb58cd] deallocate_for_instance() {{(pid=68906) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1075.499145] env[68906]: DEBUG oslo_concurrency.lockutils [None req-d970c352-17f4-4e98-8d45-165ae6d79067 tempest-ServersTestMultiNic-1243959320 tempest-ServersTestMultiNic-1243959320-project-member] Lock "6d7b4648-a12f-4c3c-8465-b8fb37eb0d3c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 208.526s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1075.513548] env[68906]: DEBUG nova.compute.manager [None req-fe2cd7df-ed0e-48a4-b92a-bb5f48ec790c tempest-InstanceActionsV221TestJSON-740479432 tempest-InstanceActionsV221TestJSON-740479432-project-member] [instance: ad955cdc-85f1-4096-b2ec-7635d289ee57] Starting instance... {{(pid=68906) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1075.524881] env[68906]: DEBUG nova.network.neutron [-] [instance: ce63789a-1f0f-40ca-8368-ac3f84bb58cd] Updating instance_info_cache with network_info: [] {{(pid=68906) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1075.542016] env[68906]: DEBUG nova.compute.manager [None req-fe2cd7df-ed0e-48a4-b92a-bb5f48ec790c tempest-InstanceActionsV221TestJSON-740479432 tempest-InstanceActionsV221TestJSON-740479432-project-member] [instance: ad955cdc-85f1-4096-b2ec-7635d289ee57] Instance disappeared before build. {{(pid=68906) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1075.542016] env[68906]: INFO nova.compute.manager [-] [instance: ce63789a-1f0f-40ca-8368-ac3f84bb58cd] Took 0.06 seconds to deallocate network for instance.
[ 1075.563322] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fe2cd7df-ed0e-48a4-b92a-bb5f48ec790c tempest-InstanceActionsV221TestJSON-740479432 tempest-InstanceActionsV221TestJSON-740479432-project-member] Lock "ad955cdc-85f1-4096-b2ec-7635d289ee57" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 198.986s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1075.580718] env[68906]: DEBUG nova.compute.manager [None req-0a3299be-e4a9-4311-8e53-c592324fe331 tempest-AttachInterfacesUnderV243Test-365796744 tempest-AttachInterfacesUnderV243Test-365796744-project-member] [instance: 9bd17d9a-2f34-4e99-91b1-a7c3fbc1f29a] Starting instance... {{(pid=68906) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1075.640211] env[68906]: DEBUG oslo_concurrency.lockutils [None req-8567c8ff-035a-4e6a-9e89-f6bf5c0abf8d tempest-VolumesAdminNegativeTest-1259132687 tempest-VolumesAdminNegativeTest-1259132687-project-member] Lock "ce63789a-1f0f-40ca-8368-ac3f84bb58cd" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.214s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1075.642082] env[68906]: DEBUG oslo_concurrency.lockutils [None req-0a3299be-e4a9-4311-8e53-c592324fe331 tempest-AttachInterfacesUnderV243Test-365796744 tempest-AttachInterfacesUnderV243Test-365796744-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1075.642531] env[68906]: DEBUG oslo_concurrency.lockutils [None req-0a3299be-e4a9-4311-8e53-c592324fe331 tempest-AttachInterfacesUnderV243Test-365796744 tempest-AttachInterfacesUnderV243Test-365796744-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1075.644250] env[68906]: INFO nova.compute.claims [None req-0a3299be-e4a9-4311-8e53-c592324fe331 tempest-AttachInterfacesUnderV243Test-365796744 tempest-AttachInterfacesUnderV243Test-365796744-project-member] [instance: 9bd17d9a-2f34-4e99-91b1-a7c3fbc1f29a] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1076.070306] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-29b996ff-c987-4bd6-865d-f156289856b1 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1076.078294] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7e7ed7b9-51e9-422f-9f04-6d721ade1ec2 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1076.107728] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e9141dbb-0e62-49cd-b9eb-e92e5b4318ec {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1076.115696] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6f4987c4-6c9e-4c2f-a580-4696e8e0affd {{(pid=68906) request_handler
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1076.129106] env[68906]: DEBUG nova.compute.provider_tree [None req-0a3299be-e4a9-4311-8e53-c592324fe331 tempest-AttachInterfacesUnderV243Test-365796744 tempest-AttachInterfacesUnderV243Test-365796744-project-member] Inventory has not changed in ProviderTree for provider: 1119f6db-bfd7-4ef3-bdff-5c6974dc249b {{(pid=68906) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1076.138720] env[68906]: DEBUG nova.scheduler.client.report [None req-0a3299be-e4a9-4311-8e53-c592324fe331 tempest-AttachInterfacesUnderV243Test-365796744 tempest-AttachInterfacesUnderV243Test-365796744-project-member] Inventory has not changed for provider 1119f6db-bfd7-4ef3-bdff-5c6974dc249b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 93, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68906) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1076.153206] env[68906]: DEBUG oslo_concurrency.lockutils [None req-0a3299be-e4a9-4311-8e53-c592324fe331 tempest-AttachInterfacesUnderV243Test-365796744 tempest-AttachInterfacesUnderV243Test-365796744-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.511s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1076.153715] env[68906]: DEBUG nova.compute.manager [None req-0a3299be-e4a9-4311-8e53-c592324fe331 tempest-AttachInterfacesUnderV243Test-365796744 tempest-AttachInterfacesUnderV243Test-365796744-project-member] [instance: 9bd17d9a-2f34-4e99-91b1-a7c3fbc1f29a] Start building networks asynchronously for instance. {{(pid=68906) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1076.212184] env[68906]: DEBUG nova.compute.utils [None req-0a3299be-e4a9-4311-8e53-c592324fe331 tempest-AttachInterfacesUnderV243Test-365796744 tempest-AttachInterfacesUnderV243Test-365796744-project-member] Using /dev/sd instead of None {{(pid=68906) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1076.214470] env[68906]: DEBUG nova.compute.manager [None req-0a3299be-e4a9-4311-8e53-c592324fe331 tempest-AttachInterfacesUnderV243Test-365796744 tempest-AttachInterfacesUnderV243Test-365796744-project-member] [instance: 9bd17d9a-2f34-4e99-91b1-a7c3fbc1f29a] Allocating IP information in the background. {{(pid=68906) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1076.214470] env[68906]: DEBUG nova.network.neutron [None req-0a3299be-e4a9-4311-8e53-c592324fe331 tempest-AttachInterfacesUnderV243Test-365796744 tempest-AttachInterfacesUnderV243Test-365796744-project-member] [instance: 9bd17d9a-2f34-4e99-91b1-a7c3fbc1f29a] allocate_for_instance() {{(pid=68906) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1076.231208] env[68906]: DEBUG nova.compute.manager [None req-0a3299be-e4a9-4311-8e53-c592324fe331 tempest-AttachInterfacesUnderV243Test-365796744 tempest-AttachInterfacesUnderV243Test-365796744-project-member] [instance: 9bd17d9a-2f34-4e99-91b1-a7c3fbc1f29a] Start building block device mappings for instance. 
{{(pid=68906) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1076.325031] env[68906]: DEBUG nova.compute.manager [None req-0a3299be-e4a9-4311-8e53-c592324fe331 tempest-AttachInterfacesUnderV243Test-365796744 tempest-AttachInterfacesUnderV243Test-365796744-project-member] [instance: 9bd17d9a-2f34-4e99-91b1-a7c3fbc1f29a] Start spawning the instance on the hypervisor. {{(pid=68906) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1076.344272] env[68906]: DEBUG nova.policy [None req-0a3299be-e4a9-4311-8e53-c592324fe331 tempest-AttachInterfacesUnderV243Test-365796744 tempest-AttachInterfacesUnderV243Test-365796744-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '6c408aebe1974cd18f7a7dc3653c9f82', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '3a389e5a7ac545cc9453d43ff02db91a', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68906) authorize /opt/stack/nova/nova/policy.py:203}} [ 1076.357161] env[68906]: DEBUG nova.virt.hardware [None req-0a3299be-e4a9-4311-8e53-c592324fe331 tempest-AttachInterfacesUnderV243Test-365796744 tempest-AttachInterfacesUnderV243Test-365796744-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-17T13:00:39Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-17T13:00:23Z,direct_url=<?>,disk_format='vmdk',id=b1400c31-d33b-4e13-944f-4c645e62493e,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='1ae7bf3a375d41c6af5e7536af51ffd1',properties=ImageMetaProps,protected=<?>,size=21318656,status='active',tags=<?>,updated_at=2025-04-17T13:00:24Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=68906) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1076.357465] env[68906]: DEBUG nova.virt.hardware [None req-0a3299be-e4a9-4311-8e53-c592324fe331 tempest-AttachInterfacesUnderV243Test-365796744 tempest-AttachInterfacesUnderV243Test-365796744-project-member] Flavor limits 0:0:0 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1076.357625] env[68906]: DEBUG nova.virt.hardware [None req-0a3299be-e4a9-4311-8e53-c592324fe331 tempest-AttachInterfacesUnderV243Test-365796744 tempest-AttachInterfacesUnderV243Test-365796744-project-member] Image limits 0:0:0 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1076.357805] env[68906]: DEBUG nova.virt.hardware [None req-0a3299be-e4a9-4311-8e53-c592324fe331 tempest-AttachInterfacesUnderV243Test-365796744 tempest-AttachInterfacesUnderV243Test-365796744-project-member] Flavor pref 0:0:0 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1076.357953] env[68906]: DEBUG nova.virt.hardware [None req-0a3299be-e4a9-4311-8e53-c592324fe331 tempest-AttachInterfacesUnderV243Test-365796744 tempest-AttachInterfacesUnderV243Test-365796744-project-member] Image pref 0:0:0 {{(pid=68906) get_cpu_topology_constraints
/opt/stack/nova/nova/virt/hardware.py:392}} [ 1076.358217] env[68906]: DEBUG nova.virt.hardware [None req-0a3299be-e4a9-4311-8e53-c592324fe331 tempest-AttachInterfacesUnderV243Test-365796744 tempest-AttachInterfacesUnderV243Test-365796744-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1076.358451] env[68906]: DEBUG nova.virt.hardware [None req-0a3299be-e4a9-4311-8e53-c592324fe331 tempest-AttachInterfacesUnderV243Test-365796744 tempest-AttachInterfacesUnderV243Test-365796744-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68906) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1076.358810] env[68906]: DEBUG nova.virt.hardware [None req-0a3299be-e4a9-4311-8e53-c592324fe331 tempest-AttachInterfacesUnderV243Test-365796744 tempest-AttachInterfacesUnderV243Test-365796744-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68906) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1076.359014] env[68906]: DEBUG nova.virt.hardware [None req-0a3299be-e4a9-4311-8e53-c592324fe331 tempest-AttachInterfacesUnderV243Test-365796744 tempest-AttachInterfacesUnderV243Test-365796744-project-member] Got 1 possible topologies {{(pid=68906) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1076.359327] env[68906]: DEBUG nova.virt.hardware [None req-0a3299be-e4a9-4311-8e53-c592324fe331 tempest-AttachInterfacesUnderV243Test-365796744 tempest-AttachInterfacesUnderV243Test-365796744-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68906) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1076.359525] env[68906]: DEBUG nova.virt.hardware [None req-0a3299be-e4a9-4311-8e53-c592324fe331 tempest-AttachInterfacesUnderV243Test-365796744 tempest-AttachInterfacesUnderV243Test-365796744-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68906) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1076.360419] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-271392a7-fe34-4755-adf5-9167cca2e9c6 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1076.368758] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-52b376cb-9430-4883-a666-feba743c4c8c {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1076.403751] env[68906]: DEBUG oslo_concurrency.lockutils [None req-68edea35-7d38-466f-87ca-34c780500d3b tempest-AttachInterfacesUnderV243Test-365796744 tempest-AttachInterfacesUnderV243Test-365796744-project-member] Acquiring lock "9bd17d9a-2f34-4e99-91b1-a7c3fbc1f29a" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1077.115433] env[68906]: DEBUG nova.network.neutron [None req-0a3299be-e4a9-4311-8e53-c592324fe331 tempest-AttachInterfacesUnderV243Test-365796744 tempest-AttachInterfacesUnderV243Test-365796744-project-member] [instance: 9bd17d9a-2f34-4e99-91b1-a7c3fbc1f29a] Successfully created
port: 7c634ecb-7b48-40e4-a5dc-3ee4546e9a5b {{(pid=68906) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1078.216085] env[68906]: DEBUG nova.compute.manager [req-507d47cd-36cc-42a6-b9b3-88e34f59cf2d req-6ee9926d-7059-4ca5-9516-4999f89b96fe service nova] [instance: 9bd17d9a-2f34-4e99-91b1-a7c3fbc1f29a] Received event network-vif-plugged-7c634ecb-7b48-40e4-a5dc-3ee4546e9a5b {{(pid=68906) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1078.216085] env[68906]: DEBUG oslo_concurrency.lockutils [req-507d47cd-36cc-42a6-b9b3-88e34f59cf2d req-6ee9926d-7059-4ca5-9516-4999f89b96fe service nova] Acquiring lock "9bd17d9a-2f34-4e99-91b1-a7c3fbc1f29a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1078.216332] env[68906]: DEBUG oslo_concurrency.lockutils [req-507d47cd-36cc-42a6-b9b3-88e34f59cf2d req-6ee9926d-7059-4ca5-9516-4999f89b96fe service nova] Lock "9bd17d9a-2f34-4e99-91b1-a7c3fbc1f29a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1078.216493] env[68906]: DEBUG oslo_concurrency.lockutils [req-507d47cd-36cc-42a6-b9b3-88e34f59cf2d req-6ee9926d-7059-4ca5-9516-4999f89b96fe service nova] Lock "9bd17d9a-2f34-4e99-91b1-a7c3fbc1f29a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1078.216657] env[68906]: DEBUG nova.compute.manager [req-507d47cd-36cc-42a6-b9b3-88e34f59cf2d req-6ee9926d-7059-4ca5-9516-4999f89b96fe service nova] [instance: 9bd17d9a-2f34-4e99-91b1-a7c3fbc1f29a] No waiting events found dispatching network-vif-plugged-7c634ecb-7b48-40e4-a5dc-3ee4546e9a5b {{(pid=68906) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1078.216817] env[68906]: WARNING nova.compute.manager [req-507d47cd-36cc-42a6-b9b3-88e34f59cf2d req-6ee9926d-7059-4ca5-9516-4999f89b96fe service nova] [instance: 9bd17d9a-2f34-4e99-91b1-a7c3fbc1f29a] Received unexpected event network-vif-plugged-7c634ecb-7b48-40e4-a5dc-3ee4546e9a5b for instance with vm_state building and task_state deleting.
[ 1078.295900] env[68906]: DEBUG nova.network.neutron [None req-0a3299be-e4a9-4311-8e53-c592324fe331 tempest-AttachInterfacesUnderV243Test-365796744 tempest-AttachInterfacesUnderV243Test-365796744-project-member] [instance: 9bd17d9a-2f34-4e99-91b1-a7c3fbc1f29a] Successfully updated port: 7c634ecb-7b48-40e4-a5dc-3ee4546e9a5b {{(pid=68906) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1078.310497] env[68906]: DEBUG oslo_concurrency.lockutils [None req-0a3299be-e4a9-4311-8e53-c592324fe331 tempest-AttachInterfacesUnderV243Test-365796744 tempest-AttachInterfacesUnderV243Test-365796744-project-member] Acquiring lock "refresh_cache-9bd17d9a-2f34-4e99-91b1-a7c3fbc1f29a" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1078.310615] env[68906]: DEBUG oslo_concurrency.lockutils [None req-0a3299be-e4a9-4311-8e53-c592324fe331 tempest-AttachInterfacesUnderV243Test-365796744 tempest-AttachInterfacesUnderV243Test-365796744-project-member] Acquired lock "refresh_cache-9bd17d9a-2f34-4e99-91b1-a7c3fbc1f29a" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1078.310670] env[68906]: DEBUG nova.network.neutron [None req-0a3299be-e4a9-4311-8e53-c592324fe331 tempest-AttachInterfacesUnderV243Test-365796744 tempest-AttachInterfacesUnderV243Test-365796744-project-member] [instance: 9bd17d9a-2f34-4e99-91b1-a7c3fbc1f29a] Building network info cache for instance {{(pid=68906) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1078.410327] env[68906]: DEBUG nova.network.neutron [None req-0a3299be-e4a9-4311-8e53-c592324fe331 tempest-AttachInterfacesUnderV243Test-365796744 tempest-AttachInterfacesUnderV243Test-365796744-project-member] [instance: 9bd17d9a-2f34-4e99-91b1-a7c3fbc1f29a] Instance cache missing network info. 
{{(pid=68906) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1078.758506] env[68906]: DEBUG nova.network.neutron [None req-0a3299be-e4a9-4311-8e53-c592324fe331 tempest-AttachInterfacesUnderV243Test-365796744 tempest-AttachInterfacesUnderV243Test-365796744-project-member] [instance: 9bd17d9a-2f34-4e99-91b1-a7c3fbc1f29a] Updating instance_info_cache with network_info: [{"id": "7c634ecb-7b48-40e4-a5dc-3ee4546e9a5b", "address": "fa:16:3e:c9:54:66", "network": {"id": "46f600d2-4fb3-4dde-ba66-1d2483f65c05", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-1455035795-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "3a389e5a7ac545cc9453d43ff02db91a", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "0a3f99df-d1bc-4a37-a048-263445d4a7b0", "external-id": "nsx-vlan-transportzone-374", "segmentation_id": 374, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap7c634ecb-7b", "ovs_interfaceid": "7c634ecb-7b48-40e4-a5dc-3ee4546e9a5b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68906) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1078.776504] env[68906]: DEBUG oslo_concurrency.lockutils [None req-0a3299be-e4a9-4311-8e53-c592324fe331 tempest-AttachInterfacesUnderV243Test-365796744 tempest-AttachInterfacesUnderV243Test-365796744-project-member] Releasing lock "refresh_cache-9bd17d9a-2f34-4e99-91b1-a7c3fbc1f29a" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1078.776837] env[68906]: DEBUG nova.compute.manager [None req-0a3299be-e4a9-4311-8e53-c592324fe331 tempest-AttachInterfacesUnderV243Test-365796744 tempest-AttachInterfacesUnderV243Test-365796744-project-member] [instance: 9bd17d9a-2f34-4e99-91b1-a7c3fbc1f29a] Instance network_info: |[{"id": "7c634ecb-7b48-40e4-a5dc-3ee4546e9a5b", "address": "fa:16:3e:c9:54:66", "network": {"id": "46f600d2-4fb3-4dde-ba66-1d2483f65c05", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-1455035795-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "3a389e5a7ac545cc9453d43ff02db91a", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "0a3f99df-d1bc-4a37-a048-263445d4a7b0", "external-id": "nsx-vlan-transportzone-374", "segmentation_id": 374, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap7c634ecb-7b", "ovs_interfaceid": "7c634ecb-7b48-40e4-a5dc-3ee4546e9a5b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| 
{{(pid=68906) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1078.777250] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-0a3299be-e4a9-4311-8e53-c592324fe331 tempest-AttachInterfacesUnderV243Test-365796744 tempest-AttachInterfacesUnderV243Test-365796744-project-member] [instance: 9bd17d9a-2f34-4e99-91b1-a7c3fbc1f29a] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:c9:54:66', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '0a3f99df-d1bc-4a37-a048-263445d4a7b0', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '7c634ecb-7b48-40e4-a5dc-3ee4546e9a5b', 'vif_model': 'vmxnet3'}] {{(pid=68906) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1078.788104] env[68906]: DEBUG nova.virt.vmwareapi.vm_util [None req-0a3299be-e4a9-4311-8e53-c592324fe331 tempest-AttachInterfacesUnderV243Test-365796744 tempest-AttachInterfacesUnderV243Test-365796744-project-member] Creating folder: Project (3a389e5a7ac545cc9453d43ff02db91a). Parent ref: group-v694750. {{(pid=68906) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1078.788705] env[68906]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-ea48da16-44a2-408e-8129-4ca9baf92916 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1078.801764] env[68906]: INFO nova.virt.vmwareapi.vm_util [None req-0a3299be-e4a9-4311-8e53-c592324fe331 tempest-AttachInterfacesUnderV243Test-365796744 tempest-AttachInterfacesUnderV243Test-365796744-project-member] Created folder: Project (3a389e5a7ac545cc9453d43ff02db91a) in parent group-v694750. [ 1078.801764] env[68906]: DEBUG nova.virt.vmwareapi.vm_util [None req-0a3299be-e4a9-4311-8e53-c592324fe331 tempest-AttachInterfacesUnderV243Test-365796744 tempest-AttachInterfacesUnderV243Test-365796744-project-member] Creating folder: Instances. Parent ref: group-v694811. {{(pid=68906) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1078.802107] env[68906]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-b66e01d7-7f30-41c0-a615-e89a3bb84cd9 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1078.810991] env[68906]: INFO nova.virt.vmwareapi.vm_util [None req-0a3299be-e4a9-4311-8e53-c592324fe331 tempest-AttachInterfacesUnderV243Test-365796744 tempest-AttachInterfacesUnderV243Test-365796744-project-member] Created folder: Instances in parent group-v694811. [ 1078.811426] env[68906]: DEBUG oslo.service.loopingcall [None req-0a3299be-e4a9-4311-8e53-c592324fe331 tempest-AttachInterfacesUnderV243Test-365796744 tempest-AttachInterfacesUnderV243Test-365796744-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=68906) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1078.811529] env[68906]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 9bd17d9a-2f34-4e99-91b1-a7c3fbc1f29a] Creating VM on the ESX host {{(pid=68906) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1078.811677] env[68906]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-422bfb22-c450-4e36-a1dc-dda6967c8c13 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1078.834466] env[68906]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1078.834466] env[68906]: value = "task-3475354" [ 1078.834466] env[68906]: _type = "Task" [ 1078.834466] env[68906]: } to complete. {{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1078.842408] env[68906]: DEBUG oslo_vmware.api [-] Task: {'id': task-3475354, 'name': CreateVM_Task} progress is 0%. {{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1079.343909] env[68906]: DEBUG oslo_vmware.api [-] Task: {'id': task-3475354, 'name': CreateVM_Task, 'duration_secs': 0.340464} completed successfully. {{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1079.344176] env[68906]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 9bd17d9a-2f34-4e99-91b1-a7c3fbc1f29a] Created VM on the ESX host {{(pid=68906) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1079.344677] env[68906]: DEBUG oslo_concurrency.lockutils [None req-0a3299be-e4a9-4311-8e53-c592324fe331 tempest-AttachInterfacesUnderV243Test-365796744 tempest-AttachInterfacesUnderV243Test-365796744-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1079.344841] env[68906]: DEBUG oslo_concurrency.lockutils [None req-0a3299be-e4a9-4311-8e53-c592324fe331 tempest-AttachInterfacesUnderV243Test-365796744 tempest-AttachInterfacesUnderV243Test-365796744-project-member] Acquired lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1079.345170] env[68906]: DEBUG oslo_concurrency.lockutils [None req-0a3299be-e4a9-4311-8e53-c592324fe331 tempest-AttachInterfacesUnderV243Test-365796744 tempest-AttachInterfacesUnderV243Test-365796744-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1079.345428] env[68906]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-6a976e96-880e-4d90-ad67-56ab974a45de {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1079.349943] env[68906]: DEBUG oslo_vmware.api [None req-0a3299be-e4a9-4311-8e53-c592324fe331 tempest-AttachInterfacesUnderV243Test-365796744 tempest-AttachInterfacesUnderV243Test-365796744-project-member] Waiting for the task: (returnval){ [ 1079.349943] env[68906]: value = "session[52a3cb0d-1212-490c-2669-91043b4da4d8]529f801c-4054-79fe-fa09-c738d401d19f" [ 1079.349943] env[68906]: _type = "Task" [ 1079.349943] 
env[68906]: } to complete. {{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1079.357895] env[68906]: DEBUG oslo_vmware.api [None req-0a3299be-e4a9-4311-8e53-c592324fe331 tempest-AttachInterfacesUnderV243Test-365796744 tempest-AttachInterfacesUnderV243Test-365796744-project-member] Task: {'id': session[52a3cb0d-1212-490c-2669-91043b4da4d8]529f801c-4054-79fe-fa09-c738d401d19f, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1079.864149] env[68906]: DEBUG oslo_concurrency.lockutils [None req-0a3299be-e4a9-4311-8e53-c592324fe331 tempest-AttachInterfacesUnderV243Test-365796744 tempest-AttachInterfacesUnderV243Test-365796744-project-member] Releasing lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1079.864446] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-0a3299be-e4a9-4311-8e53-c592324fe331 tempest-AttachInterfacesUnderV243Test-365796744 tempest-AttachInterfacesUnderV243Test-365796744-project-member] [instance: 9bd17d9a-2f34-4e99-91b1-a7c3fbc1f29a] Processing image b1400c31-d33b-4e13-944f-4c645e62493e {{(pid=68906) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1079.864656] env[68906]: DEBUG oslo_concurrency.lockutils [None req-0a3299be-e4a9-4311-8e53-c592324fe331 tempest-AttachInterfacesUnderV243Test-365796744 tempest-AttachInterfacesUnderV243Test-365796744-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e/b1400c31-d33b-4e13-944f-4c645e62493e.vmdk" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1080.312335] env[68906]: DEBUG oslo_concurrency.lockutils [None req-8cc1b37d-3cc0-4e00-8b4e-1ea1e183df21 tempest-ServerAddressesTestJSON-218061599 tempest-ServerAddressesTestJSON-218061599-project-member] Acquiring lock "6c28f571-e74a-48f9-9cc7-a9e4ddea8b09" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1080.313057] env[68906]: DEBUG oslo_concurrency.lockutils [None req-8cc1b37d-3cc0-4e00-8b4e-1ea1e183df21 tempest-ServerAddressesTestJSON-218061599 tempest-ServerAddressesTestJSON-218061599-project-member] Lock "6c28f571-e74a-48f9-9cc7-a9e4ddea8b09" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1080.557060] env[68906]: DEBUG nova.compute.manager [req-3c93345b-c553-408a-a889-5c0921f01e8a req-dea0b12f-b16c-42b6-8b85-2e7d3840c1e7 service nova] [instance: 9bd17d9a-2f34-4e99-91b1-a7c3fbc1f29a] Received event network-changed-7c634ecb-7b48-40e4-a5dc-3ee4546e9a5b {{(pid=68906) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1080.557318] env[68906]: DEBUG nova.compute.manager [req-3c93345b-c553-408a-a889-5c0921f01e8a req-dea0b12f-b16c-42b6-8b85-2e7d3840c1e7 service nova] [instance: 9bd17d9a-2f34-4e99-91b1-a7c3fbc1f29a] Refreshing instance network info cache due to event network-changed-7c634ecb-7b48-40e4-a5dc-3ee4546e9a5b. 
{{(pid=68906) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1080.557527] env[68906]: DEBUG oslo_concurrency.lockutils [req-3c93345b-c553-408a-a889-5c0921f01e8a req-dea0b12f-b16c-42b6-8b85-2e7d3840c1e7 service nova] Acquiring lock "refresh_cache-9bd17d9a-2f34-4e99-91b1-a7c3fbc1f29a" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1080.557614] env[68906]: DEBUG oslo_concurrency.lockutils [req-3c93345b-c553-408a-a889-5c0921f01e8a req-dea0b12f-b16c-42b6-8b85-2e7d3840c1e7 service nova] Acquired lock "refresh_cache-9bd17d9a-2f34-4e99-91b1-a7c3fbc1f29a" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1080.557768] env[68906]: DEBUG nova.network.neutron [req-3c93345b-c553-408a-a889-5c0921f01e8a req-dea0b12f-b16c-42b6-8b85-2e7d3840c1e7 service nova] [instance: 9bd17d9a-2f34-4e99-91b1-a7c3fbc1f29a] Refreshing network info cache for port 7c634ecb-7b48-40e4-a5dc-3ee4546e9a5b {{(pid=68906) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1080.945988] env[68906]: DEBUG nova.network.neutron [req-3c93345b-c553-408a-a889-5c0921f01e8a req-dea0b12f-b16c-42b6-8b85-2e7d3840c1e7 service nova] [instance: 9bd17d9a-2f34-4e99-91b1-a7c3fbc1f29a] Updated VIF entry in instance network info cache for port 7c634ecb-7b48-40e4-a5dc-3ee4546e9a5b. {{(pid=68906) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1080.946402] env[68906]: DEBUG nova.network.neutron [req-3c93345b-c553-408a-a889-5c0921f01e8a req-dea0b12f-b16c-42b6-8b85-2e7d3840c1e7 service nova] [instance: 9bd17d9a-2f34-4e99-91b1-a7c3fbc1f29a] Updating instance_info_cache with network_info: [{"id": "7c634ecb-7b48-40e4-a5dc-3ee4546e9a5b", "address": "fa:16:3e:c9:54:66", "network": {"id": "46f600d2-4fb3-4dde-ba66-1d2483f65c05", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-1455035795-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "3a389e5a7ac545cc9453d43ff02db91a", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "0a3f99df-d1bc-4a37-a048-263445d4a7b0", "external-id": "nsx-vlan-transportzone-374", "segmentation_id": 374, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap7c634ecb-7b", "ovs_interfaceid": "7c634ecb-7b48-40e4-a5dc-3ee4546e9a5b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68906) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1080.956619] env[68906]: DEBUG oslo_concurrency.lockutils [req-3c93345b-c553-408a-a889-5c0921f01e8a req-dea0b12f-b16c-42b6-8b85-2e7d3840c1e7 service nova] Releasing lock "refresh_cache-9bd17d9a-2f34-4e99-91b1-a7c3fbc1f29a" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1086.691070] env[68906]: DEBUG oslo_concurrency.lockutils [None req-bc278641-e9dc-495f-b85d-d30d2e1da09f tempest-ServerShowV254Test-1577433897 tempest-ServerShowV254Test-1577433897-project-member] 
Acquiring lock "e0e595e3-e47e-4cf1-8977-f004eca942d1" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1086.693511] env[68906]: DEBUG oslo_concurrency.lockutils [None req-bc278641-e9dc-495f-b85d-d30d2e1da09f tempest-ServerShowV254Test-1577433897 tempest-ServerShowV254Test-1577433897-project-member] Lock "e0e595e3-e47e-4cf1-8977-f004eca942d1" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1090.622633] env[68906]: DEBUG oslo_concurrency.lockutils [None req-1a3cd471-afd5-4758-9ba7-114ed58755e9 tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] Acquiring lock "3c36e8a4-da45-457e-b4ef-001f4a4e595f" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1090.622952] env[68906]: DEBUG oslo_concurrency.lockutils [None req-1a3cd471-afd5-4758-9ba7-114ed58755e9 tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] Lock "3c36e8a4-da45-457e-b4ef-001f4a4e595f" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1091.608844] env[68906]: DEBUG oslo_concurrency.lockutils [None req-7e5b0fff-7692-45bc-b660-9f08afcd6b69 tempest-AttachVolumeShelveTestJSON-1059946953 tempest-AttachVolumeShelveTestJSON-1059946953-project-member] Acquiring lock "582a086e-5122-41f2-8fb8-513b3734eef4" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1091.609100] env[68906]: DEBUG oslo_concurrency.lockutils [None req-7e5b0fff-7692-45bc-b660-9f08afcd6b69 tempest-AttachVolumeShelveTestJSON-1059946953 tempest-AttachVolumeShelveTestJSON-1059946953-project-member] Lock "582a086e-5122-41f2-8fb8-513b3734eef4" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1091.611430] env[68906]: DEBUG oslo_concurrency.lockutils [None req-b0388411-7fe1-4e49-b7da-6e4027223a15 tempest-ServerRescueTestJSON-1075537064 tempest-ServerRescueTestJSON-1075537064-project-member] Acquiring lock "159edc16-55bb-46eb-8fa9-7da7c1f36cd0" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1091.611723] env[68906]: DEBUG oslo_concurrency.lockutils [None req-b0388411-7fe1-4e49-b7da-6e4027223a15 tempest-ServerRescueTestJSON-1075537064 tempest-ServerRescueTestJSON-1075537064-project-member] Lock "159edc16-55bb-46eb-8fa9-7da7c1f36cd0" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 
1094.685991] env[68906]: DEBUG oslo_concurrency.lockutils [None req-0b200139-c804-48cc-b35b-ce4dd8cb7f66 tempest-DeleteServersTestJSON-1763795391 tempest-DeleteServersTestJSON-1763795391-project-member] Acquiring lock "13b471c5-c86e-4b55-a231-159b2219de2f" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1094.686302] env[68906]: DEBUG oslo_concurrency.lockutils [None req-0b200139-c804-48cc-b35b-ce4dd8cb7f66 tempest-DeleteServersTestJSON-1763795391 tempest-DeleteServersTestJSON-1763795391-project-member] Lock "13b471c5-c86e-4b55-a231-159b2219de2f" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1102.341601] env[68906]: DEBUG oslo_concurrency.lockutils [None req-4c548662-eee4-4e29-93f1-9fe857d3d075 tempest-ServerRescueTestJSONUnderV235-1338611176 tempest-ServerRescueTestJSONUnderV235-1338611176-project-member] Acquiring lock "d01b8b11-bc3b-47dc-8687-a111c1453ed9" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1102.341601] env[68906]: DEBUG oslo_concurrency.lockutils [None req-4c548662-eee4-4e29-93f1-9fe857d3d075 tempest-ServerRescueTestJSONUnderV235-1338611176 tempest-ServerRescueTestJSONUnderV235-1338611176-project-member] Lock "d01b8b11-bc3b-47dc-8687-a111c1453ed9" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1115.873963] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1117.141517] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1119.142644] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1119.143027] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1119.143083] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=68906) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 1120.136806] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1120.140442] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1120.140606] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Starting heal instance info cache {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 1120.140730] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Rebuilding the list of instances to heal {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 1120.170016] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: f42056e5-52cb-4d69-8022-ca643c49194e] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1120.170260] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: 9a2d2803-34b1-40f7-9349-e5734a217e18] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1120.170303] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: 13eebe4e-5984-46c3-bb73-cd783ad45df6] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1120.170426] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1120.170553] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1120.170676] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: 91d405f9-5ad1-4e8e-9a32-1a5ef81ecc63] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1120.170799] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: acc11633-a489-4d8f-ad76-f17049a91545] Skipping network cache update for instance because it is Building. 
{{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1120.170918] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: e7286888-d79d-4632-9c06-69c1ef47fa50] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1120.171048] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: 641cca5b-d749-4331-a5e0-8acb6d47cba2] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1120.171171] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: 9bd17d9a-2f34-4e99-91b1-a7c3fbc1f29a] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1120.171291] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Didn't find any instances for network info cache update. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 1120.639303] env[68906]: WARNING oslo_vmware.rw_handles [None req-dac96d6f-6d7b-49e4-904b-bb16242f78bb tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1120.639303] env[68906]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1120.639303] env[68906]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1120.639303] env[68906]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1120.639303] env[68906]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1120.639303] env[68906]: ERROR oslo_vmware.rw_handles response.begin() [ 1120.639303] env[68906]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1120.639303] env[68906]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1120.639303] env[68906]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1120.639303] env[68906]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1120.639303] env[68906]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1120.639303] env[68906]: ERROR oslo_vmware.rw_handles [ 1120.639303] env[68906]: DEBUG nova.virt.vmwareapi.images [None req-dac96d6f-6d7b-49e4-904b-bb16242f78bb tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] [instance: f42056e5-52cb-4d69-8022-ca643c49194e] Downloaded image file data b1400c31-d33b-4e13-944f-4c645e62493e to vmware_temp/50995ed0-6bf4-4dff-a727-c43c53444392/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk on the data store datastore2 {{(pid=68906) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1120.640836] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-dac96d6f-6d7b-49e4-904b-bb16242f78bb tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] [instance: 
f42056e5-52cb-4d69-8022-ca643c49194e] Caching image {{(pid=68906) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1120.641172] env[68906]: DEBUG nova.virt.vmwareapi.vm_util [None req-dac96d6f-6d7b-49e4-904b-bb16242f78bb tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] Copying Virtual Disk [datastore2] vmware_temp/50995ed0-6bf4-4dff-a727-c43c53444392/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk to [datastore2] vmware_temp/50995ed0-6bf4-4dff-a727-c43c53444392/b1400c31-d33b-4e13-944f-4c645e62493e/b1400c31-d33b-4e13-944f-4c645e62493e.vmdk {{(pid=68906) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1120.641515] env[68906]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-337f1c43-e894-45f2-a200-03a3a6b74bbd {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1120.649406] env[68906]: DEBUG oslo_vmware.api [None req-dac96d6f-6d7b-49e4-904b-bb16242f78bb tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] Waiting for the task: (returnval){ [ 1120.649406] env[68906]: value = "task-3475355" [ 1120.649406] env[68906]: _type = "Task" [ 1120.649406] env[68906]: } to complete. {{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1120.657033] env[68906]: DEBUG oslo_vmware.api [None req-dac96d6f-6d7b-49e4-904b-bb16242f78bb tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] Task: {'id': task-3475355, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1121.160777] env[68906]: DEBUG oslo_vmware.exceptions [None req-dac96d6f-6d7b-49e4-904b-bb16242f78bb tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] Fault InvalidArgument not matched. 
{{(pid=68906) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1121.164016] env[68906]: DEBUG oslo_concurrency.lockutils [None req-dac96d6f-6d7b-49e4-904b-bb16242f78bb tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] Releasing lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e/b1400c31-d33b-4e13-944f-4c645e62493e.vmdk" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1121.164016] env[68906]: ERROR nova.compute.manager [None req-dac96d6f-6d7b-49e4-904b-bb16242f78bb tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] [instance: f42056e5-52cb-4d69-8022-ca643c49194e] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1121.164016] env[68906]: Faults: ['InvalidArgument'] [ 1121.164016] env[68906]: ERROR nova.compute.manager [instance: f42056e5-52cb-4d69-8022-ca643c49194e] Traceback (most recent call last): [ 1121.164016] env[68906]: ERROR nova.compute.manager [instance: f42056e5-52cb-4d69-8022-ca643c49194e] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1121.164016] env[68906]: ERROR nova.compute.manager [instance: f42056e5-52cb-4d69-8022-ca643c49194e] yield resources [ 1121.164016] env[68906]: ERROR nova.compute.manager [instance: f42056e5-52cb-4d69-8022-ca643c49194e] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1121.164016] env[68906]: ERROR nova.compute.manager [instance: f42056e5-52cb-4d69-8022-ca643c49194e] self.driver.spawn(context, instance, image_meta, [ 1121.164016] env[68906]: ERROR nova.compute.manager [instance: f42056e5-52cb-4d69-8022-ca643c49194e] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1121.164016] env[68906]: ERROR nova.compute.manager [instance: f42056e5-52cb-4d69-8022-ca643c49194e] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1121.164380] env[68906]: ERROR nova.compute.manager [instance: f42056e5-52cb-4d69-8022-ca643c49194e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1121.164380] env[68906]: ERROR nova.compute.manager [instance: f42056e5-52cb-4d69-8022-ca643c49194e] self._fetch_image_if_missing(context, vi) [ 1121.164380] env[68906]: ERROR nova.compute.manager [instance: f42056e5-52cb-4d69-8022-ca643c49194e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1121.164380] env[68906]: ERROR nova.compute.manager [instance: f42056e5-52cb-4d69-8022-ca643c49194e] image_cache(vi, tmp_image_ds_loc) [ 1121.164380] env[68906]: ERROR nova.compute.manager [instance: f42056e5-52cb-4d69-8022-ca643c49194e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1121.164380] env[68906]: ERROR nova.compute.manager [instance: f42056e5-52cb-4d69-8022-ca643c49194e] vm_util.copy_virtual_disk( [ 1121.164380] env[68906]: ERROR nova.compute.manager [instance: f42056e5-52cb-4d69-8022-ca643c49194e] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1121.164380] env[68906]: ERROR nova.compute.manager [instance: f42056e5-52cb-4d69-8022-ca643c49194e] session._wait_for_task(vmdk_copy_task) [ 1121.164380] env[68906]: ERROR nova.compute.manager [instance: f42056e5-52cb-4d69-8022-ca643c49194e] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", 
line 157, in _wait_for_task [ 1121.164380] env[68906]: ERROR nova.compute.manager [instance: f42056e5-52cb-4d69-8022-ca643c49194e] return self.wait_for_task(task_ref) [ 1121.164380] env[68906]: ERROR nova.compute.manager [instance: f42056e5-52cb-4d69-8022-ca643c49194e] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1121.164380] env[68906]: ERROR nova.compute.manager [instance: f42056e5-52cb-4d69-8022-ca643c49194e] return evt.wait() [ 1121.164380] env[68906]: ERROR nova.compute.manager [instance: f42056e5-52cb-4d69-8022-ca643c49194e] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1121.164737] env[68906]: ERROR nova.compute.manager [instance: f42056e5-52cb-4d69-8022-ca643c49194e] result = hub.switch() [ 1121.164737] env[68906]: ERROR nova.compute.manager [instance: f42056e5-52cb-4d69-8022-ca643c49194e] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1121.164737] env[68906]: ERROR nova.compute.manager [instance: f42056e5-52cb-4d69-8022-ca643c49194e] return self.greenlet.switch() [ 1121.164737] env[68906]: ERROR nova.compute.manager [instance: f42056e5-52cb-4d69-8022-ca643c49194e] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1121.164737] env[68906]: ERROR nova.compute.manager [instance: f42056e5-52cb-4d69-8022-ca643c49194e] self.f(*self.args, **self.kw) [ 1121.164737] env[68906]: ERROR nova.compute.manager [instance: f42056e5-52cb-4d69-8022-ca643c49194e] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1121.164737] env[68906]: ERROR nova.compute.manager [instance: f42056e5-52cb-4d69-8022-ca643c49194e] raise exceptions.translate_fault(task_info.error) [ 1121.164737] env[68906]: ERROR nova.compute.manager [instance: f42056e5-52cb-4d69-8022-ca643c49194e] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1121.164737] env[68906]: ERROR nova.compute.manager [instance: f42056e5-52cb-4d69-8022-ca643c49194e] Faults: ['InvalidArgument'] [ 1121.164737] env[68906]: ERROR nova.compute.manager [instance: f42056e5-52cb-4d69-8022-ca643c49194e] [ 1121.164737] env[68906]: INFO nova.compute.manager [None req-dac96d6f-6d7b-49e4-904b-bb16242f78bb tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] [instance: f42056e5-52cb-4d69-8022-ca643c49194e] Terminating instance [ 1121.165034] env[68906]: DEBUG oslo_concurrency.lockutils [None req-237bd299-a887-4870-a49f-dd296ebfa92b tempest-VolumesAssistedSnapshotsTest-429909811 tempest-VolumesAssistedSnapshotsTest-429909811-project-member] Acquired lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e/b1400c31-d33b-4e13-944f-4c645e62493e.vmdk" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1121.165034] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-237bd299-a887-4870-a49f-dd296ebfa92b tempest-VolumesAssistedSnapshotsTest-429909811 tempest-VolumesAssistedSnapshotsTest-429909811-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=68906) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1121.165034] env[68906]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-07bfbf89-326d-4fdc-8956-155173a13954 {{(pid=68906) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1121.167020] env[68906]: DEBUG nova.compute.manager [None req-dac96d6f-6d7b-49e4-904b-bb16242f78bb tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] [instance: f42056e5-52cb-4d69-8022-ca643c49194e] Start destroying the instance on the hypervisor. {{(pid=68906) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1121.167225] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-dac96d6f-6d7b-49e4-904b-bb16242f78bb tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] [instance: f42056e5-52cb-4d69-8022-ca643c49194e] Destroying instance {{(pid=68906) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1121.167942] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-564bab57-544d-4f0b-a39d-54daf6bddc31 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1121.174690] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-dac96d6f-6d7b-49e4-904b-bb16242f78bb tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] [instance: f42056e5-52cb-4d69-8022-ca643c49194e] Unregistering the VM {{(pid=68906) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1121.174943] env[68906]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-ddd56fe1-6b98-42b1-b8fd-9f3fbf6941b3 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1121.177082] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-237bd299-a887-4870-a49f-dd296ebfa92b tempest-VolumesAssistedSnapshotsTest-429909811 tempest-VolumesAssistedSnapshotsTest-429909811-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=68906) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1121.177253] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-237bd299-a887-4870-a49f-dd296ebfa92b tempest-VolumesAssistedSnapshotsTest-429909811 tempest-VolumesAssistedSnapshotsTest-429909811-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=68906) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1121.178169] env[68906]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-8197e003-83f5-4e92-b481-9547e67bc2c7 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1121.183419] env[68906]: DEBUG oslo_vmware.api [None req-237bd299-a887-4870-a49f-dd296ebfa92b tempest-VolumesAssistedSnapshotsTest-429909811 tempest-VolumesAssistedSnapshotsTest-429909811-project-member] Waiting for the task: (returnval){ [ 1121.183419] env[68906]: value = "session[52a3cb0d-1212-490c-2669-91043b4da4d8]522a904b-161a-a6ed-df7b-40ed001e54f5" [ 1121.183419] env[68906]: _type = "Task" [ 1121.183419] env[68906]: } to complete. 
{{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1121.196955] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-237bd299-a887-4870-a49f-dd296ebfa92b tempest-VolumesAssistedSnapshotsTest-429909811 tempest-VolumesAssistedSnapshotsTest-429909811-project-member] [instance: 9a2d2803-34b1-40f7-9349-e5734a217e18] Preparing fetch location {{(pid=68906) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1121.197282] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-237bd299-a887-4870-a49f-dd296ebfa92b tempest-VolumesAssistedSnapshotsTest-429909811 tempest-VolumesAssistedSnapshotsTest-429909811-project-member] Creating directory with path [datastore2] vmware_temp/7b87e44a-0226-402c-821e-65c8f4798883/b1400c31-d33b-4e13-944f-4c645e62493e {{(pid=68906) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1121.197504] env[68906]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-a23cf5c1-2899-48a6-a95c-b89e0ebf4791 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1121.223166] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-237bd299-a887-4870-a49f-dd296ebfa92b tempest-VolumesAssistedSnapshotsTest-429909811 tempest-VolumesAssistedSnapshotsTest-429909811-project-member] Created directory with path [datastore2] vmware_temp/7b87e44a-0226-402c-821e-65c8f4798883/b1400c31-d33b-4e13-944f-4c645e62493e {{(pid=68906) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1121.223401] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-237bd299-a887-4870-a49f-dd296ebfa92b tempest-VolumesAssistedSnapshotsTest-429909811 tempest-VolumesAssistedSnapshotsTest-429909811-project-member] [instance: 9a2d2803-34b1-40f7-9349-e5734a217e18] Fetch image to [datastore2] vmware_temp/7b87e44a-0226-402c-821e-65c8f4798883/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk {{(pid=68906) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1121.223575] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-237bd299-a887-4870-a49f-dd296ebfa92b tempest-VolumesAssistedSnapshotsTest-429909811 tempest-VolumesAssistedSnapshotsTest-429909811-project-member] [instance: 9a2d2803-34b1-40f7-9349-e5734a217e18] Downloading image file data b1400c31-d33b-4e13-944f-4c645e62493e to [datastore2] vmware_temp/7b87e44a-0226-402c-821e-65c8f4798883/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk on the data store datastore2 {{(pid=68906) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1121.224381] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-20716c4c-1e9f-46eb-af09-b5f8bf90c10e {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1121.231830] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fd243f86-ed49-41b4-816a-987de498bca3 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1121.241239] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d9e5f168-5b61-44b8-af86-28f966b1f225 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1121.274223] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-6e090f60-d46c-47b8-9a41-3a62c4fecef2 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1121.276846] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-dac96d6f-6d7b-49e4-904b-bb16242f78bb tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] [instance: f42056e5-52cb-4d69-8022-ca643c49194e] Unregistered the VM {{(pid=68906) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1121.277041] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-dac96d6f-6d7b-49e4-904b-bb16242f78bb tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] [instance: f42056e5-52cb-4d69-8022-ca643c49194e] Deleting contents of the VM from datastore datastore2 {{(pid=68906) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1121.277223] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-dac96d6f-6d7b-49e4-904b-bb16242f78bb tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] Deleting the datastore file [datastore2] f42056e5-52cb-4d69-8022-ca643c49194e {{(pid=68906) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1121.277455] env[68906]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-a9641366-fa33-4194-9bea-db6ad7771cfb {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1121.282366] env[68906]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-6a677537-29d5-47c4-9fc6-9a005a4178c8 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1121.285036] env[68906]: DEBUG oslo_vmware.api [None req-dac96d6f-6d7b-49e4-904b-bb16242f78bb tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] Waiting for the task: (returnval){ [ 1121.285036] env[68906]: value = "task-3475357" [ 1121.285036] env[68906]: _type = "Task" [ 1121.285036] env[68906]: } to complete. {{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1121.293626] env[68906]: DEBUG oslo_vmware.api [None req-dac96d6f-6d7b-49e4-904b-bb16242f78bb tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] Task: {'id': task-3475357, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1121.307236] env[68906]: DEBUG nova.virt.vmwareapi.images [None req-237bd299-a887-4870-a49f-dd296ebfa92b tempest-VolumesAssistedSnapshotsTest-429909811 tempest-VolumesAssistedSnapshotsTest-429909811-project-member] [instance: 9a2d2803-34b1-40f7-9349-e5734a217e18] Downloading image file data b1400c31-d33b-4e13-944f-4c645e62493e to the data store datastore2 {{(pid=68906) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1121.367301] env[68906]: DEBUG oslo_vmware.rw_handles [None req-237bd299-a887-4870-a49f-dd296ebfa92b tempest-VolumesAssistedSnapshotsTest-429909811 tempest-VolumesAssistedSnapshotsTest-429909811-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/7b87e44a-0226-402c-821e-65c8f4798883/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=68906) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1121.431873] env[68906]: DEBUG oslo_vmware.rw_handles [None req-237bd299-a887-4870-a49f-dd296ebfa92b tempest-VolumesAssistedSnapshotsTest-429909811 tempest-VolumesAssistedSnapshotsTest-429909811-project-member] Completed reading data from the image iterator. {{(pid=68906) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1121.432098] env[68906]: DEBUG oslo_vmware.rw_handles [None req-237bd299-a887-4870-a49f-dd296ebfa92b tempest-VolumesAssistedSnapshotsTest-429909811 tempest-VolumesAssistedSnapshotsTest-429909811-project-member] Closing write handle for https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/7b87e44a-0226-402c-821e-65c8f4798883/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=68906) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1121.795812] env[68906]: DEBUG oslo_vmware.api [None req-dac96d6f-6d7b-49e4-904b-bb16242f78bb tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] Task: {'id': task-3475357, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.28175} completed successfully. 
{{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1121.796075] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-dac96d6f-6d7b-49e4-904b-bb16242f78bb tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] Deleted the datastore file {{(pid=68906) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1121.796262] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-dac96d6f-6d7b-49e4-904b-bb16242f78bb tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] [instance: f42056e5-52cb-4d69-8022-ca643c49194e] Deleted contents of the VM from datastore datastore2 {{(pid=68906) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1121.796438] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-dac96d6f-6d7b-49e4-904b-bb16242f78bb tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] [instance: f42056e5-52cb-4d69-8022-ca643c49194e] Instance destroyed {{(pid=68906) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1121.796607] env[68906]: INFO nova.compute.manager [None req-dac96d6f-6d7b-49e4-904b-bb16242f78bb tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] [instance: f42056e5-52cb-4d69-8022-ca643c49194e] Took 0.63 seconds to destroy the instance on the hypervisor. [ 1121.798723] env[68906]: DEBUG nova.compute.claims [None req-dac96d6f-6d7b-49e4-904b-bb16242f78bb tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] [instance: f42056e5-52cb-4d69-8022-ca643c49194e] Aborting claim: {{(pid=68906) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1121.798897] env[68906]: DEBUG oslo_concurrency.lockutils [None req-dac96d6f-6d7b-49e4-904b-bb16242f78bb tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1121.799144] env[68906]: DEBUG oslo_concurrency.lockutils [None req-dac96d6f-6d7b-49e4-904b-bb16242f78bb tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1122.141491] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1122.230104] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-13c1f737-0d11-400d-99e4-f9d5d747e449 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1122.237939] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9a36db83-bc03-45f4-958d-b3533565b5c7 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1122.270747] env[68906]: DEBUG oslo_vmware.service [-] Invoking 
PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2bd4e29b-3aa3-4656-add4-da0b6bb3fba7 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1122.278292] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b6039ed3-d3c9-4e33-a2b9-1dff1c06bb41 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1122.291765] env[68906]: DEBUG nova.compute.provider_tree [None req-dac96d6f-6d7b-49e4-904b-bb16242f78bb tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] Inventory has not changed in ProviderTree for provider: 1119f6db-bfd7-4ef3-bdff-5c6974dc249b {{(pid=68906) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1122.303678] env[68906]: DEBUG nova.scheduler.client.report [None req-dac96d6f-6d7b-49e4-904b-bb16242f78bb tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] Inventory has not changed for provider 1119f6db-bfd7-4ef3-bdff-5c6974dc249b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 93, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68906) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1122.324784] env[68906]: DEBUG oslo_concurrency.lockutils [None req-dac96d6f-6d7b-49e4-904b-bb16242f78bb tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.523s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1122.324784] env[68906]: ERROR nova.compute.manager [None req-dac96d6f-6d7b-49e4-904b-bb16242f78bb tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] [instance: f42056e5-52cb-4d69-8022-ca643c49194e] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1122.324784] env[68906]: Faults: ['InvalidArgument'] [ 1122.324784] env[68906]: ERROR nova.compute.manager [instance: f42056e5-52cb-4d69-8022-ca643c49194e] Traceback (most recent call last): [ 1122.324784] env[68906]: ERROR nova.compute.manager [instance: f42056e5-52cb-4d69-8022-ca643c49194e] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1122.324784] env[68906]: ERROR nova.compute.manager [instance: f42056e5-52cb-4d69-8022-ca643c49194e] self.driver.spawn(context, instance, image_meta, [ 1122.324784] env[68906]: ERROR nova.compute.manager [instance: f42056e5-52cb-4d69-8022-ca643c49194e] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1122.324784] env[68906]: ERROR nova.compute.manager [instance: f42056e5-52cb-4d69-8022-ca643c49194e] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1122.324784] env[68906]: ERROR nova.compute.manager [instance: f42056e5-52cb-4d69-8022-ca643c49194e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1122.324784] env[68906]: ERROR nova.compute.manager [instance: 
f42056e5-52cb-4d69-8022-ca643c49194e] self._fetch_image_if_missing(context, vi) [ 1122.325280] env[68906]: ERROR nova.compute.manager [instance: f42056e5-52cb-4d69-8022-ca643c49194e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1122.325280] env[68906]: ERROR nova.compute.manager [instance: f42056e5-52cb-4d69-8022-ca643c49194e] image_cache(vi, tmp_image_ds_loc) [ 1122.325280] env[68906]: ERROR nova.compute.manager [instance: f42056e5-52cb-4d69-8022-ca643c49194e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1122.325280] env[68906]: ERROR nova.compute.manager [instance: f42056e5-52cb-4d69-8022-ca643c49194e] vm_util.copy_virtual_disk( [ 1122.325280] env[68906]: ERROR nova.compute.manager [instance: f42056e5-52cb-4d69-8022-ca643c49194e] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1122.325280] env[68906]: ERROR nova.compute.manager [instance: f42056e5-52cb-4d69-8022-ca643c49194e] session._wait_for_task(vmdk_copy_task) [ 1122.325280] env[68906]: ERROR nova.compute.manager [instance: f42056e5-52cb-4d69-8022-ca643c49194e] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1122.325280] env[68906]: ERROR nova.compute.manager [instance: f42056e5-52cb-4d69-8022-ca643c49194e] return self.wait_for_task(task_ref) [ 1122.325280] env[68906]: ERROR nova.compute.manager [instance: f42056e5-52cb-4d69-8022-ca643c49194e] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1122.325280] env[68906]: ERROR nova.compute.manager [instance: f42056e5-52cb-4d69-8022-ca643c49194e] return evt.wait() [ 1122.325280] env[68906]: ERROR nova.compute.manager [instance: f42056e5-52cb-4d69-8022-ca643c49194e] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1122.325280] env[68906]: ERROR nova.compute.manager [instance: f42056e5-52cb-4d69-8022-ca643c49194e] result = hub.switch() [ 1122.325280] env[68906]: ERROR nova.compute.manager [instance: f42056e5-52cb-4d69-8022-ca643c49194e] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1122.325654] env[68906]: ERROR nova.compute.manager [instance: f42056e5-52cb-4d69-8022-ca643c49194e] return self.greenlet.switch() [ 1122.325654] env[68906]: ERROR nova.compute.manager [instance: f42056e5-52cb-4d69-8022-ca643c49194e] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1122.325654] env[68906]: ERROR nova.compute.manager [instance: f42056e5-52cb-4d69-8022-ca643c49194e] self.f(*self.args, **self.kw) [ 1122.325654] env[68906]: ERROR nova.compute.manager [instance: f42056e5-52cb-4d69-8022-ca643c49194e] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1122.325654] env[68906]: ERROR nova.compute.manager [instance: f42056e5-52cb-4d69-8022-ca643c49194e] raise exceptions.translate_fault(task_info.error) [ 1122.325654] env[68906]: ERROR nova.compute.manager [instance: f42056e5-52cb-4d69-8022-ca643c49194e] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1122.325654] env[68906]: ERROR nova.compute.manager [instance: f42056e5-52cb-4d69-8022-ca643c49194e] Faults: ['InvalidArgument'] [ 1122.325654] env[68906]: ERROR nova.compute.manager [instance: f42056e5-52cb-4d69-8022-ca643c49194e] [ 1122.325654] env[68906]: DEBUG nova.compute.utils 
[None req-dac96d6f-6d7b-49e4-904b-bb16242f78bb tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] [instance: f42056e5-52cb-4d69-8022-ca643c49194e] VimFaultException {{(pid=68906) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1122.326435] env[68906]: DEBUG nova.compute.manager [None req-dac96d6f-6d7b-49e4-904b-bb16242f78bb tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] [instance: f42056e5-52cb-4d69-8022-ca643c49194e] Build of instance f42056e5-52cb-4d69-8022-ca643c49194e was re-scheduled: A specified parameter was not correct: fileType [ 1122.326435] env[68906]: Faults: ['InvalidArgument'] {{(pid=68906) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1122.327025] env[68906]: DEBUG nova.compute.manager [None req-dac96d6f-6d7b-49e4-904b-bb16242f78bb tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] [instance: f42056e5-52cb-4d69-8022-ca643c49194e] Unplugging VIFs for instance {{(pid=68906) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1122.327321] env[68906]: DEBUG nova.compute.manager [None req-dac96d6f-6d7b-49e4-904b-bb16242f78bb tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=68906) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1122.327665] env[68906]: DEBUG nova.compute.manager [None req-dac96d6f-6d7b-49e4-904b-bb16242f78bb tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] [instance: f42056e5-52cb-4d69-8022-ca643c49194e] Deallocating network for instance {{(pid=68906) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1122.328266] env[68906]: DEBUG nova.network.neutron [None req-dac96d6f-6d7b-49e4-904b-bb16242f78bb tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] [instance: f42056e5-52cb-4d69-8022-ca643c49194e] deallocate_for_instance() {{(pid=68906) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1122.892722] env[68906]: DEBUG nova.network.neutron [None req-dac96d6f-6d7b-49e4-904b-bb16242f78bb tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] [instance: f42056e5-52cb-4d69-8022-ca643c49194e] Updating instance_info_cache with network_info: [] {{(pid=68906) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1122.909173] env[68906]: INFO nova.compute.manager [None req-dac96d6f-6d7b-49e4-904b-bb16242f78bb tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] [instance: f42056e5-52cb-4d69-8022-ca643c49194e] Took 0.58 seconds to deallocate network for instance. 
[ 1123.055015] env[68906]: INFO nova.scheduler.client.report [None req-dac96d6f-6d7b-49e4-904b-bb16242f78bb tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] Deleted allocations for instance f42056e5-52cb-4d69-8022-ca643c49194e [ 1123.088814] env[68906]: DEBUG oslo_concurrency.lockutils [None req-dac96d6f-6d7b-49e4-904b-bb16242f78bb tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] Lock "f42056e5-52cb-4d69-8022-ca643c49194e" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 478.925s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1123.090322] env[68906]: DEBUG oslo_concurrency.lockutils [None req-ca97fdf4-ff68-4e8e-b6cb-46693d4ecb80 tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] Lock "f42056e5-52cb-4d69-8022-ca643c49194e" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 80.874s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1123.090552] env[68906]: DEBUG oslo_concurrency.lockutils [None req-ca97fdf4-ff68-4e8e-b6cb-46693d4ecb80 tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] Acquiring lock "f42056e5-52cb-4d69-8022-ca643c49194e-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1123.090763] env[68906]: DEBUG oslo_concurrency.lockutils [None req-ca97fdf4-ff68-4e8e-b6cb-46693d4ecb80 tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] Lock "f42056e5-52cb-4d69-8022-ca643c49194e-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1123.090937] env[68906]: DEBUG oslo_concurrency.lockutils [None req-ca97fdf4-ff68-4e8e-b6cb-46693d4ecb80 tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] Lock "f42056e5-52cb-4d69-8022-ca643c49194e-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1123.092930] env[68906]: INFO nova.compute.manager [None req-ca97fdf4-ff68-4e8e-b6cb-46693d4ecb80 tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] [instance: f42056e5-52cb-4d69-8022-ca643c49194e] Terminating instance [ 1123.094495] env[68906]: DEBUG nova.compute.manager [None req-ca97fdf4-ff68-4e8e-b6cb-46693d4ecb80 tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] [instance: f42056e5-52cb-4d69-8022-ca643c49194e] Start destroying the instance on the hypervisor. 
{{(pid=68906) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1123.094694] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-ca97fdf4-ff68-4e8e-b6cb-46693d4ecb80 tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] [instance: f42056e5-52cb-4d69-8022-ca643c49194e] Destroying instance {{(pid=68906) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1123.095157] env[68906]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-91f76956-23ee-40f2-baf5-82d7c92b2595 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1123.103919] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b14bf813-0162-4d29-a2b9-b1392ff5f508 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1123.116135] env[68906]: DEBUG nova.compute.manager [None req-2fcfd0ef-6d38-4dd9-8a43-b17729caedcf tempest-ServerRescueNegativeTestJSON-608372629 tempest-ServerRescueNegativeTestJSON-608372629-project-member] [instance: a37ef3ce-1c29-48fe-b9c6-023da5b3db71] Starting instance... {{(pid=68906) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1123.138810] env[68906]: WARNING nova.virt.vmwareapi.vmops [None req-ca97fdf4-ff68-4e8e-b6cb-46693d4ecb80 tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] [instance: f42056e5-52cb-4d69-8022-ca643c49194e] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance f42056e5-52cb-4d69-8022-ca643c49194e could not be found. [ 1123.138810] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-ca97fdf4-ff68-4e8e-b6cb-46693d4ecb80 tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] [instance: f42056e5-52cb-4d69-8022-ca643c49194e] Instance destroyed {{(pid=68906) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1123.138810] env[68906]: INFO nova.compute.manager [None req-ca97fdf4-ff68-4e8e-b6cb-46693d4ecb80 tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] [instance: f42056e5-52cb-4d69-8022-ca643c49194e] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1123.138810] env[68906]: DEBUG oslo.service.loopingcall [None req-ca97fdf4-ff68-4e8e-b6cb-46693d4ecb80 tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=68906) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1123.139024] env[68906]: DEBUG nova.compute.manager [-] [instance: f42056e5-52cb-4d69-8022-ca643c49194e] Deallocating network for instance {{(pid=68906) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1123.139060] env[68906]: DEBUG nova.network.neutron [-] [instance: f42056e5-52cb-4d69-8022-ca643c49194e] deallocate_for_instance() {{(pid=68906) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1123.142132] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1123.149968] env[68906]: DEBUG nova.compute.manager [None req-2fcfd0ef-6d38-4dd9-8a43-b17729caedcf tempest-ServerRescueNegativeTestJSON-608372629 tempest-ServerRescueNegativeTestJSON-608372629-project-member] [instance: a37ef3ce-1c29-48fe-b9c6-023da5b3db71] Instance disappeared before build. {{(pid=68906) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1123.173853] env[68906]: DEBUG oslo_concurrency.lockutils [None req-2fcfd0ef-6d38-4dd9-8a43-b17729caedcf tempest-ServerRescueNegativeTestJSON-608372629 tempest-ServerRescueNegativeTestJSON-608372629-project-member] Lock "a37ef3ce-1c29-48fe-b9c6-023da5b3db71" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 242.648s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1123.186089] env[68906]: DEBUG nova.compute.manager [None req-3ef12865-803b-471a-95a5-dc05f12c5571 tempest-ServerRescueNegativeTestJSON-608372629 tempest-ServerRescueNegativeTestJSON-608372629-project-member] [instance: ee17e223-bec7-4541-8cb2-25e4a6c32b34] Starting instance... {{(pid=68906) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1123.188541] env[68906]: DEBUG nova.network.neutron [-] [instance: f42056e5-52cb-4d69-8022-ca643c49194e] Updating instance_info_cache with network_info: [] {{(pid=68906) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1123.201412] env[68906]: INFO nova.compute.manager [-] [instance: f42056e5-52cb-4d69-8022-ca643c49194e] Took 0.06 seconds to deallocate network for instance. [ 1123.225648] env[68906]: DEBUG nova.compute.manager [None req-3ef12865-803b-471a-95a5-dc05f12c5571 tempest-ServerRescueNegativeTestJSON-608372629 tempest-ServerRescueNegativeTestJSON-608372629-project-member] [instance: ee17e223-bec7-4541-8cb2-25e4a6c32b34] Instance disappeared before build. 
{{(pid=68906) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1123.327665] env[68906]: DEBUG oslo_concurrency.lockutils [None req-3ef12865-803b-471a-95a5-dc05f12c5571 tempest-ServerRescueNegativeTestJSON-608372629 tempest-ServerRescueNegativeTestJSON-608372629-project-member] Lock "ee17e223-bec7-4541-8cb2-25e4a6c32b34" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 239.529s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1123.356361] env[68906]: DEBUG nova.compute.manager [None req-759a674a-a9dc-4cda-86d0-b5ec5eba1b78 tempest-ServerActionsTestOtherB-612778985 tempest-ServerActionsTestOtherB-612778985-project-member] [instance: 56f936b4-680d-40db-84ab-8eb319f6ee83] Starting instance... {{(pid=68906) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1123.390968] env[68906]: DEBUG nova.compute.manager [None req-759a674a-a9dc-4cda-86d0-b5ec5eba1b78 tempest-ServerActionsTestOtherB-612778985 tempest-ServerActionsTestOtherB-612778985-project-member] [instance: 56f936b4-680d-40db-84ab-8eb319f6ee83] Instance disappeared before build. {{(pid=68906) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1123.423364] env[68906]: DEBUG oslo_concurrency.lockutils [None req-759a674a-a9dc-4cda-86d0-b5ec5eba1b78 tempest-ServerActionsTestOtherB-612778985 tempest-ServerActionsTestOtherB-612778985-project-member] Lock "56f936b4-680d-40db-84ab-8eb319f6ee83" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 238.481s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1123.451859] env[68906]: DEBUG nova.compute.manager [None req-5dfc4571-f06c-4c81-9226-1d215bbb2db9 tempest-ServerMetadataNegativeTestJSON-343818070 tempest-ServerMetadataNegativeTestJSON-343818070-project-member] [instance: 3ba4a60f-6c41-4e1e-8928-f1b95b885028] Starting instance... {{(pid=68906) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1123.484245] env[68906]: DEBUG nova.compute.manager [None req-5dfc4571-f06c-4c81-9226-1d215bbb2db9 tempest-ServerMetadataNegativeTestJSON-343818070 tempest-ServerMetadataNegativeTestJSON-343818070-project-member] [instance: 3ba4a60f-6c41-4e1e-8928-f1b95b885028] Instance disappeared before build. 
{{(pid=68906) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1123.491165] env[68906]: DEBUG oslo_concurrency.lockutils [None req-ca97fdf4-ff68-4e8e-b6cb-46693d4ecb80 tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] Lock "f42056e5-52cb-4d69-8022-ca643c49194e" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.401s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1123.512277] env[68906]: DEBUG oslo_concurrency.lockutils [None req-5dfc4571-f06c-4c81-9226-1d215bbb2db9 tempest-ServerMetadataNegativeTestJSON-343818070 tempest-ServerMetadataNegativeTestJSON-343818070-project-member] Lock "3ba4a60f-6c41-4e1e-8928-f1b95b885028" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 238.571s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1123.521521] env[68906]: DEBUG nova.compute.manager [None req-8d8d8524-17bd-4588-8951-e1f296fbac81 tempest-MigrationsAdminTest-890819033 tempest-MigrationsAdminTest-890819033-project-member] [instance: faec727e-bd92-4201-aaca-5863208be265] Starting instance... {{(pid=68906) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1123.546176] env[68906]: DEBUG nova.compute.manager [None req-8d8d8524-17bd-4588-8951-e1f296fbac81 tempest-MigrationsAdminTest-890819033 tempest-MigrationsAdminTest-890819033-project-member] [instance: faec727e-bd92-4201-aaca-5863208be265] Instance disappeared before build. {{(pid=68906) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1123.573638] env[68906]: DEBUG oslo_concurrency.lockutils [None req-8d8d8524-17bd-4588-8951-e1f296fbac81 tempest-MigrationsAdminTest-890819033 tempest-MigrationsAdminTest-890819033-project-member] Lock "faec727e-bd92-4201-aaca-5863208be265" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 237.323s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1123.584042] env[68906]: DEBUG nova.compute.manager [None req-1475f613-fd4b-4f1d-8fac-623e658f362f tempest-AttachVolumeShelveTestJSON-1059946953 tempest-AttachVolumeShelveTestJSON-1059946953-project-member] [instance: 18a5c392-b836-4d2a-bb77-d4af0b9fdb81] Starting instance... {{(pid=68906) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1123.610012] env[68906]: DEBUG nova.compute.manager [None req-1475f613-fd4b-4f1d-8fac-623e658f362f tempest-AttachVolumeShelveTestJSON-1059946953 tempest-AttachVolumeShelveTestJSON-1059946953-project-member] [instance: 18a5c392-b836-4d2a-bb77-d4af0b9fdb81] Instance disappeared before build. 
{{(pid=68906) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1123.634740] env[68906]: DEBUG oslo_concurrency.lockutils [None req-1475f613-fd4b-4f1d-8fac-623e658f362f tempest-AttachVolumeShelveTestJSON-1059946953 tempest-AttachVolumeShelveTestJSON-1059946953-project-member] Lock "18a5c392-b836-4d2a-bb77-d4af0b9fdb81" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 233.427s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1123.649236] env[68906]: DEBUG nova.compute.manager [None req-237495d6-6000-404f-b4a3-e46a4b8ba4ce tempest-DeleteServersTestJSON-1763795391 tempest-DeleteServersTestJSON-1763795391-project-member] [instance: 2e6de0b1-335b-49bd-aa15-3fd4cc4b4e9e] Starting instance... {{(pid=68906) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1123.672279] env[68906]: DEBUG nova.compute.manager [None req-237495d6-6000-404f-b4a3-e46a4b8ba4ce tempest-DeleteServersTestJSON-1763795391 tempest-DeleteServersTestJSON-1763795391-project-member] [instance: 2e6de0b1-335b-49bd-aa15-3fd4cc4b4e9e] Instance disappeared before build. {{(pid=68906) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1123.707731] env[68906]: DEBUG oslo_concurrency.lockutils [None req-237495d6-6000-404f-b4a3-e46a4b8ba4ce tempest-DeleteServersTestJSON-1763795391 tempest-DeleteServersTestJSON-1763795391-project-member] Lock "2e6de0b1-335b-49bd-aa15-3fd4cc4b4e9e" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 226.419s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1123.718630] env[68906]: DEBUG nova.compute.manager [None req-8595702b-bc02-4dde-85d1-b5b6b00301b0 tempest-ImagesOneServerTestJSON-2105933643 tempest-ImagesOneServerTestJSON-2105933643-project-member] [instance: d71bae07-54c1-427b-bfe1-2467369cd80c] Starting instance... {{(pid=68906) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1123.745327] env[68906]: DEBUG nova.compute.manager [None req-8595702b-bc02-4dde-85d1-b5b6b00301b0 tempest-ImagesOneServerTestJSON-2105933643 tempest-ImagesOneServerTestJSON-2105933643-project-member] [instance: d71bae07-54c1-427b-bfe1-2467369cd80c] Instance disappeared before build. {{(pid=68906) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1123.773041] env[68906]: DEBUG oslo_concurrency.lockutils [None req-8595702b-bc02-4dde-85d1-b5b6b00301b0 tempest-ImagesOneServerTestJSON-2105933643 tempest-ImagesOneServerTestJSON-2105933643-project-member] Lock "d71bae07-54c1-427b-bfe1-2467369cd80c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 225.931s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1123.787120] env[68906]: DEBUG nova.compute.manager [None req-00e83854-94e1-4ab2-8698-d76f53e7ae92 tempest-ServerActionsV293TestJSON-1770613532 tempest-ServerActionsV293TestJSON-1770613532-project-member] [instance: 682f0e61-471f-47fb-98de-02449b17d241] Starting instance... 
{{(pid=68906) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1123.820363] env[68906]: DEBUG nova.compute.manager [None req-00e83854-94e1-4ab2-8698-d76f53e7ae92 tempest-ServerActionsV293TestJSON-1770613532 tempest-ServerActionsV293TestJSON-1770613532-project-member] [instance: 682f0e61-471f-47fb-98de-02449b17d241] Instance disappeared before build. {{(pid=68906) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1123.846284] env[68906]: DEBUG oslo_concurrency.lockutils [None req-00e83854-94e1-4ab2-8698-d76f53e7ae92 tempest-ServerActionsV293TestJSON-1770613532 tempest-ServerActionsV293TestJSON-1770613532-project-member] Lock "682f0e61-471f-47fb-98de-02449b17d241" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 198.902s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1123.860367] env[68906]: DEBUG nova.compute.manager [None req-bf384869-204f-4e55-8c64-1babf5601dfe tempest-ServerMetadataTestJSON-461341781 tempest-ServerMetadataTestJSON-461341781-project-member] [instance: 4d36bb91-0cde-44cb-8706-d17740a9cf50] Starting instance... {{(pid=68906) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1123.930244] env[68906]: DEBUG oslo_concurrency.lockutils [None req-bf384869-204f-4e55-8c64-1babf5601dfe tempest-ServerMetadataTestJSON-461341781 tempest-ServerMetadataTestJSON-461341781-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1123.930900] env[68906]: DEBUG oslo_concurrency.lockutils [None req-bf384869-204f-4e55-8c64-1babf5601dfe tempest-ServerMetadataTestJSON-461341781 tempest-ServerMetadataTestJSON-461341781-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1123.932460] env[68906]: INFO nova.compute.claims [None req-bf384869-204f-4e55-8c64-1babf5601dfe tempest-ServerMetadataTestJSON-461341781 tempest-ServerMetadataTestJSON-461341781-project-member] [instance: 4d36bb91-0cde-44cb-8706-d17740a9cf50] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1124.144014] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager.update_available_resource {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1124.159434] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1124.353923] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-72b377aa-13d4-48ef-9d6d-aaa29aa6bd4d {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1124.361939] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-efb49c5f-0499-411b-bf5c-8a8d6420dcef 
{{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1124.394014] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a77835ff-0e97-48d6-ba49-52591e8f80eb {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1124.402146] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b8baa7e1-0f1f-463e-9e90-767e0ccdfc5c {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1124.415570] env[68906]: DEBUG nova.compute.provider_tree [None req-bf384869-204f-4e55-8c64-1babf5601dfe tempest-ServerMetadataTestJSON-461341781 tempest-ServerMetadataTestJSON-461341781-project-member] Inventory has not changed in ProviderTree for provider: 1119f6db-bfd7-4ef3-bdff-5c6974dc249b {{(pid=68906) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1124.424414] env[68906]: DEBUG nova.scheduler.client.report [None req-bf384869-204f-4e55-8c64-1babf5601dfe tempest-ServerMetadataTestJSON-461341781 tempest-ServerMetadataTestJSON-461341781-project-member] Inventory has not changed for provider 1119f6db-bfd7-4ef3-bdff-5c6974dc249b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 93, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68906) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1124.463837] env[68906]: DEBUG oslo_concurrency.lockutils [None req-bf384869-204f-4e55-8c64-1babf5601dfe tempest-ServerMetadataTestJSON-461341781 tempest-ServerMetadataTestJSON-461341781-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.533s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1124.464383] env[68906]: DEBUG nova.compute.manager [None req-bf384869-204f-4e55-8c64-1babf5601dfe tempest-ServerMetadataTestJSON-461341781 tempest-ServerMetadataTestJSON-461341781-project-member] [instance: 4d36bb91-0cde-44cb-8706-d17740a9cf50] Start building networks asynchronously for instance. 
{{(pid=68906) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1124.470399] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.308s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1124.470399] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1124.470399] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=68906) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1124.470399] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0a1504c7-3a1c-44b3-ad5c-34da43108786 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1124.478508] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-015708f9-deef-4286-ade2-0312a59db0b5 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1124.492785] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-eb503ab4-6ab6-4880-b863-562aa4156a43 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1124.499468] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b8379638-b14b-4790-9401-a918bb3c59d3 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1124.529616] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180971MB free_disk=93GB free_vcpus=48 pci_devices=None {{(pid=68906) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1124.529833] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1124.530079] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1124.535522] env[68906]: DEBUG nova.compute.utils [None req-bf384869-204f-4e55-8c64-1babf5601dfe tempest-ServerMetadataTestJSON-461341781 tempest-ServerMetadataTestJSON-461341781-project-member] Using /dev/sd instead of None {{(pid=68906) get_next_device_name 
/opt/stack/nova/nova/compute/utils.py:238}} [ 1124.536781] env[68906]: DEBUG nova.compute.manager [None req-bf384869-204f-4e55-8c64-1babf5601dfe tempest-ServerMetadataTestJSON-461341781 tempest-ServerMetadataTestJSON-461341781-project-member] [instance: 4d36bb91-0cde-44cb-8706-d17740a9cf50] Allocating IP information in the background. {{(pid=68906) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1124.537197] env[68906]: DEBUG nova.network.neutron [None req-bf384869-204f-4e55-8c64-1babf5601dfe tempest-ServerMetadataTestJSON-461341781 tempest-ServerMetadataTestJSON-461341781-project-member] [instance: 4d36bb91-0cde-44cb-8706-d17740a9cf50] allocate_for_instance() {{(pid=68906) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1124.550420] env[68906]: DEBUG nova.compute.manager [None req-bf384869-204f-4e55-8c64-1babf5601dfe tempest-ServerMetadataTestJSON-461341781 tempest-ServerMetadataTestJSON-461341781-project-member] [instance: 4d36bb91-0cde-44cb-8706-d17740a9cf50] Start building block device mappings for instance. {{(pid=68906) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1124.623390] env[68906]: DEBUG nova.policy [None req-bf384869-204f-4e55-8c64-1babf5601dfe tempest-ServerMetadataTestJSON-461341781 tempest-ServerMetadataTestJSON-461341781-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '64ca41bce57f4577bb3e0867432e5c61', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '82f4fba881f4486786656229113cd0d0', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68906) authorize /opt/stack/nova/nova/policy.py:203}} [ 1124.642444] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 13eebe4e-5984-46c3-bb73-cd783ad45df6 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1124.642616] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 9a2d2803-34b1-40f7-9349-e5734a217e18 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1124.642801] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance a7e0a28f-42a5-442e-b962-07771d2e6a27 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1124.642911] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance eb81e9b1-b573-4d7c-9ede-f8b32a43a201 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1124.643056] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 91d405f9-5ad1-4e8e-9a32-1a5ef81ecc63 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1124.643162] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance acc11633-a489-4d8f-ad76-f17049a91545 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1124.643275] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance e7286888-d79d-4632-9c06-69c1ef47fa50 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1124.643388] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 641cca5b-d749-4331-a5e0-8acb6d47cba2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1124.643500] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 9bd17d9a-2f34-4e99-91b1-a7c3fbc1f29a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1124.643699] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 4d36bb91-0cde-44cb-8706-d17740a9cf50 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1124.658567] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance db011373-7285-4882-8bce-d39cfa22fe80 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1124.669491] env[68906]: DEBUG nova.compute.manager [None req-bf384869-204f-4e55-8c64-1babf5601dfe tempest-ServerMetadataTestJSON-461341781 tempest-ServerMetadataTestJSON-461341781-project-member] [instance: 4d36bb91-0cde-44cb-8706-d17740a9cf50] Start spawning the instance on the hypervisor. 
{{(pid=68906) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1124.673103] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 1fdb401a-ac25-4418-803c-fc0b2297f2d4 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1124.688935] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance d6be39b6-8bbc-4657-9ceb-9a4110c29c53 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1124.703335] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 4a616d87-7b55-4b1f-b938-9d9261e8b2cd has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1124.723375] env[68906]: DEBUG nova.virt.hardware [None req-bf384869-204f-4e55-8c64-1babf5601dfe tempest-ServerMetadataTestJSON-461341781 tempest-ServerMetadataTestJSON-461341781-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-17T13:00:39Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-17T13:00:23Z,direct_url=,disk_format='vmdk',id=b1400c31-d33b-4e13-944f-4c645e62493e,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='1ae7bf3a375d41c6af5e7536af51ffd1',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-17T13:00:24Z,virtual_size=,visibility=), allow threads: False {{(pid=68906) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1124.723517] env[68906]: DEBUG nova.virt.hardware [None req-bf384869-204f-4e55-8c64-1babf5601dfe tempest-ServerMetadataTestJSON-461341781 tempest-ServerMetadataTestJSON-461341781-project-member] Flavor limits 0:0:0 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1124.723560] env[68906]: DEBUG nova.virt.hardware [None req-bf384869-204f-4e55-8c64-1babf5601dfe tempest-ServerMetadataTestJSON-461341781 tempest-ServerMetadataTestJSON-461341781-project-member] Image limits 0:0:0 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1124.723741] env[68906]: DEBUG nova.virt.hardware [None req-bf384869-204f-4e55-8c64-1babf5601dfe tempest-ServerMetadataTestJSON-461341781 tempest-ServerMetadataTestJSON-461341781-project-member] Flavor pref 0:0:0 {{(pid=68906) get_cpu_topology_constraints 
/opt/stack/nova/nova/virt/hardware.py:388}} [ 1124.723885] env[68906]: DEBUG nova.virt.hardware [None req-bf384869-204f-4e55-8c64-1babf5601dfe tempest-ServerMetadataTestJSON-461341781 tempest-ServerMetadataTestJSON-461341781-project-member] Image pref 0:0:0 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1124.724046] env[68906]: DEBUG nova.virt.hardware [None req-bf384869-204f-4e55-8c64-1babf5601dfe tempest-ServerMetadataTestJSON-461341781 tempest-ServerMetadataTestJSON-461341781-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1124.724269] env[68906]: DEBUG nova.virt.hardware [None req-bf384869-204f-4e55-8c64-1babf5601dfe tempest-ServerMetadataTestJSON-461341781 tempest-ServerMetadataTestJSON-461341781-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68906) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1124.724431] env[68906]: DEBUG nova.virt.hardware [None req-bf384869-204f-4e55-8c64-1babf5601dfe tempest-ServerMetadataTestJSON-461341781 tempest-ServerMetadataTestJSON-461341781-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68906) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1124.724602] env[68906]: DEBUG nova.virt.hardware [None req-bf384869-204f-4e55-8c64-1babf5601dfe tempest-ServerMetadataTestJSON-461341781 tempest-ServerMetadataTestJSON-461341781-project-member] Got 1 possible topologies {{(pid=68906) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1124.724772] env[68906]: DEBUG nova.virt.hardware [None req-bf384869-204f-4e55-8c64-1babf5601dfe tempest-ServerMetadataTestJSON-461341781 tempest-ServerMetadataTestJSON-461341781-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68906) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1124.724940] env[68906]: DEBUG nova.virt.hardware [None req-bf384869-204f-4e55-8c64-1babf5601dfe tempest-ServerMetadataTestJSON-461341781 tempest-ServerMetadataTestJSON-461341781-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68906) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1124.726805] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a2d69a2a-5c6b-4a7c-b334-ba6018edcf00 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1124.731962] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 38248e62-53b8-402e-aa29-d9a445b0af9d has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1124.738874] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e3124845-7dcd-4bf0-9934-c1c6894a08d5 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1124.743957] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 60ba9060-c3c3-4561-b9e9-e2df08e2e38b has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1124.757690] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance e8a14af6-ab4f-407e-943d-4dd3a46c8711 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1124.772745] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 57078f52-8070-480e-b8ea-278ef759f0a3 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1124.787160] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 6c28f571-e74a-48f9-9cc7-a9e4ddea8b09 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1124.798478] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance e0e595e3-e47e-4cf1-8977-f004eca942d1 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1124.810715] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 3c36e8a4-da45-457e-b4ef-001f4a4e595f has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1124.824494] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 582a086e-5122-41f2-8fb8-513b3734eef4 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1124.835441] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 159edc16-55bb-46eb-8fa9-7da7c1f36cd0 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1124.847521] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 13b471c5-c86e-4b55-a231-159b2219de2f has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1124.866908] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance d01b8b11-bc3b-47dc-8687-a111c1453ed9 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1124.867223] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=68906) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1124.867381] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=68906) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1125.311658] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cd69768c-e901-49ee-965c-248fc75bd9af {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1125.320039] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-573138c4-52f0-44d0-96cc-b6bf0d49fe11 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1125.356756] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-225b34dc-617b-42f8-9670-8262b735688c {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1125.369485] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0d816efc-55d4-466d-a03d-304d42166897 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1125.380626] env[68906]: DEBUG nova.compute.provider_tree [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Inventory has not changed in ProviderTree for provider: 1119f6db-bfd7-4ef3-bdff-5c6974dc249b {{(pid=68906) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1125.426168] env[68906]: DEBUG nova.scheduler.client.report [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Inventory has not changed for provider 1119f6db-bfd7-4ef3-bdff-5c6974dc249b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 93, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68906) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1125.451388] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=68906) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1125.452509] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.922s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1125.674122] env[68906]: DEBUG nova.network.neutron [None 
req-bf384869-204f-4e55-8c64-1babf5601dfe tempest-ServerMetadataTestJSON-461341781 tempest-ServerMetadataTestJSON-461341781-project-member] [instance: 4d36bb91-0cde-44cb-8706-d17740a9cf50] Successfully created port: 98f87d2b-3488-4754-a82b-6211e5dfa8ca {{(pid=68906) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1126.766237] env[68906]: DEBUG nova.compute.manager [req-849c68c1-36f1-488d-b6e0-8605c1f4d2a3 req-ef418d91-082e-4669-81ed-1224c96ef208 service nova] [instance: 4d36bb91-0cde-44cb-8706-d17740a9cf50] Received event network-vif-plugged-98f87d2b-3488-4754-a82b-6211e5dfa8ca {{(pid=68906) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1126.766537] env[68906]: DEBUG oslo_concurrency.lockutils [req-849c68c1-36f1-488d-b6e0-8605c1f4d2a3 req-ef418d91-082e-4669-81ed-1224c96ef208 service nova] Acquiring lock "4d36bb91-0cde-44cb-8706-d17740a9cf50-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1126.766681] env[68906]: DEBUG oslo_concurrency.lockutils [req-849c68c1-36f1-488d-b6e0-8605c1f4d2a3 req-ef418d91-082e-4669-81ed-1224c96ef208 service nova] Lock "4d36bb91-0cde-44cb-8706-d17740a9cf50-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1126.766853] env[68906]: DEBUG oslo_concurrency.lockutils [req-849c68c1-36f1-488d-b6e0-8605c1f4d2a3 req-ef418d91-082e-4669-81ed-1224c96ef208 service nova] Lock "4d36bb91-0cde-44cb-8706-d17740a9cf50-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1126.767085] env[68906]: DEBUG nova.compute.manager [req-849c68c1-36f1-488d-b6e0-8605c1f4d2a3 req-ef418d91-082e-4669-81ed-1224c96ef208 service nova] [instance: 4d36bb91-0cde-44cb-8706-d17740a9cf50] No waiting events found dispatching network-vif-plugged-98f87d2b-3488-4754-a82b-6211e5dfa8ca {{(pid=68906) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1126.767262] env[68906]: WARNING nova.compute.manager [req-849c68c1-36f1-488d-b6e0-8605c1f4d2a3 req-ef418d91-082e-4669-81ed-1224c96ef208 service nova] [instance: 4d36bb91-0cde-44cb-8706-d17740a9cf50] Received unexpected event network-vif-plugged-98f87d2b-3488-4754-a82b-6211e5dfa8ca for instance with vm_state building and task_state spawning. 
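The "Received unexpected event" warning above reflects Nova's instance-event handshake: the spawn path registers the event name it expects (network-vif-plugged-<port-id>) before waiting, and the external event handler pops and signals that waiter when Neutron delivers the event; if nothing has registered a waiter yet, the event is logged as unexpected, which is harmless here because the port came up before the spawn path started waiting. A toy model of that bookkeeping, not Nova's actual InstanceEvents class:

    import threading

    class InstanceEvents:
        # Toy model of the expect/pop handshake described above.
        def __init__(self):
            self._lock = threading.Lock()
            self._waiters = {}  # event name -> threading.Event

        def prepare_for_event(self, name):
            # Called by the spawn path before it starts waiting.
            with self._lock:
                evt = threading.Event()
                self._waiters[name] = evt
                return evt

        def pop_event(self, name):
            # Called by the external-event handler on a Neutron notification.
            with self._lock:
                return self._waiters.pop(name, None)

    events = InstanceEvents()
    evt = events.pop_event('network-vif-plugged-98f87d2b')
    if evt is not None:
        evt.set()  # wakes the thread blocked in the spawn path
    else:
        # No registered waiter: the WARNING path seen in the log above.
        print('No waiting events found; event is unexpected')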
[ 1126.864098] env[68906]: DEBUG nova.network.neutron [None req-bf384869-204f-4e55-8c64-1babf5601dfe tempest-ServerMetadataTestJSON-461341781 tempest-ServerMetadataTestJSON-461341781-project-member] [instance: 4d36bb91-0cde-44cb-8706-d17740a9cf50] Successfully updated port: 98f87d2b-3488-4754-a82b-6211e5dfa8ca {{(pid=68906) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1126.886381] env[68906]: DEBUG oslo_concurrency.lockutils [None req-bf384869-204f-4e55-8c64-1babf5601dfe tempest-ServerMetadataTestJSON-461341781 tempest-ServerMetadataTestJSON-461341781-project-member] Acquiring lock "refresh_cache-4d36bb91-0cde-44cb-8706-d17740a9cf50" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1126.886659] env[68906]: DEBUG oslo_concurrency.lockutils [None req-bf384869-204f-4e55-8c64-1babf5601dfe tempest-ServerMetadataTestJSON-461341781 tempest-ServerMetadataTestJSON-461341781-project-member] Acquired lock "refresh_cache-4d36bb91-0cde-44cb-8706-d17740a9cf50" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1126.886879] env[68906]: DEBUG nova.network.neutron [None req-bf384869-204f-4e55-8c64-1babf5601dfe tempest-ServerMetadataTestJSON-461341781 tempest-ServerMetadataTestJSON-461341781-project-member] [instance: 4d36bb91-0cde-44cb-8706-d17740a9cf50] Building network info cache for instance {{(pid=68906) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1126.944394] env[68906]: DEBUG nova.network.neutron [None req-bf384869-204f-4e55-8c64-1babf5601dfe tempest-ServerMetadataTestJSON-461341781 tempest-ServerMetadataTestJSON-461341781-project-member] [instance: 4d36bb91-0cde-44cb-8706-d17740a9cf50] Instance cache missing network info. 
{{(pid=68906) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1127.232180] env[68906]: DEBUG nova.network.neutron [None req-bf384869-204f-4e55-8c64-1babf5601dfe tempest-ServerMetadataTestJSON-461341781 tempest-ServerMetadataTestJSON-461341781-project-member] [instance: 4d36bb91-0cde-44cb-8706-d17740a9cf50] Updating instance_info_cache with network_info: [{"id": "98f87d2b-3488-4754-a82b-6211e5dfa8ca", "address": "fa:16:3e:d0:70:65", "network": {"id": "81fcd64c-6798-46e1-9057-ad60eaf63dda", "bridge": "br-int", "label": "tempest-ServerMetadataTestJSON-1786758875-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "82f4fba881f4486786656229113cd0d0", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "b49c5024-2ced-42ca-90cc-6066766d43e6", "external-id": "nsx-vlan-transportzone-239", "segmentation_id": 239, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap98f87d2b-34", "ovs_interfaceid": "98f87d2b-3488-4754-a82b-6211e5dfa8ca", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68906) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1127.257666] env[68906]: DEBUG oslo_concurrency.lockutils [None req-bf384869-204f-4e55-8c64-1babf5601dfe tempest-ServerMetadataTestJSON-461341781 tempest-ServerMetadataTestJSON-461341781-project-member] Releasing lock "refresh_cache-4d36bb91-0cde-44cb-8706-d17740a9cf50" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1127.257986] env[68906]: DEBUG nova.compute.manager [None req-bf384869-204f-4e55-8c64-1babf5601dfe tempest-ServerMetadataTestJSON-461341781 tempest-ServerMetadataTestJSON-461341781-project-member] [instance: 4d36bb91-0cde-44cb-8706-d17740a9cf50] Instance network_info: |[{"id": "98f87d2b-3488-4754-a82b-6211e5dfa8ca", "address": "fa:16:3e:d0:70:65", "network": {"id": "81fcd64c-6798-46e1-9057-ad60eaf63dda", "bridge": "br-int", "label": "tempest-ServerMetadataTestJSON-1786758875-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "82f4fba881f4486786656229113cd0d0", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "b49c5024-2ced-42ca-90cc-6066766d43e6", "external-id": "nsx-vlan-transportzone-239", "segmentation_id": 239, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap98f87d2b-34", "ovs_interfaceid": "98f87d2b-3488-4754-a82b-6211e5dfa8ca", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=68906) _allocate_network_async 
/opt/stack/nova/nova/compute/manager.py:1971}} [ 1127.258458] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-bf384869-204f-4e55-8c64-1babf5601dfe tempest-ServerMetadataTestJSON-461341781 tempest-ServerMetadataTestJSON-461341781-project-member] [instance: 4d36bb91-0cde-44cb-8706-d17740a9cf50] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:d0:70:65', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'b49c5024-2ced-42ca-90cc-6066766d43e6', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '98f87d2b-3488-4754-a82b-6211e5dfa8ca', 'vif_model': 'vmxnet3'}] {{(pid=68906) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1127.266865] env[68906]: DEBUG nova.virt.vmwareapi.vm_util [None req-bf384869-204f-4e55-8c64-1babf5601dfe tempest-ServerMetadataTestJSON-461341781 tempest-ServerMetadataTestJSON-461341781-project-member] Creating folder: Project (82f4fba881f4486786656229113cd0d0). Parent ref: group-v694750. {{(pid=68906) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1127.267438] env[68906]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-f667a1e4-2474-4629-b504-0ffc05b0915f {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1127.278668] env[68906]: INFO nova.virt.vmwareapi.vm_util [None req-bf384869-204f-4e55-8c64-1babf5601dfe tempest-ServerMetadataTestJSON-461341781 tempest-ServerMetadataTestJSON-461341781-project-member] Created folder: Project (82f4fba881f4486786656229113cd0d0) in parent group-v694750. [ 1127.278841] env[68906]: DEBUG nova.virt.vmwareapi.vm_util [None req-bf384869-204f-4e55-8c64-1babf5601dfe tempest-ServerMetadataTestJSON-461341781 tempest-ServerMetadataTestJSON-461341781-project-member] Creating folder: Instances. Parent ref: group-v694814. {{(pid=68906) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1127.279076] env[68906]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-07960275-300b-4d84-b8c3-796f267c75a2 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1127.287407] env[68906]: INFO nova.virt.vmwareapi.vm_util [None req-bf384869-204f-4e55-8c64-1babf5601dfe tempest-ServerMetadataTestJSON-461341781 tempest-ServerMetadataTestJSON-461341781-project-member] Created folder: Instances in parent group-v694814. [ 1127.287640] env[68906]: DEBUG oslo.service.loopingcall [None req-bf384869-204f-4e55-8c64-1babf5601dfe tempest-ServerMetadataTestJSON-461341781 tempest-ServerMetadataTestJSON-461341781-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=68906) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1127.288347] env[68906]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 4d36bb91-0cde-44cb-8706-d17740a9cf50] Creating VM on the ESX host {{(pid=68906) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1127.288347] env[68906]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-1f4f3ab8-612c-4911-927c-cb9e41cf28be {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1127.307959] env[68906]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1127.307959] env[68906]: value = "task-3475360" [ 1127.307959] env[68906]: _type = "Task" [ 1127.307959] env[68906]: } to complete. 
{{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1127.315891] env[68906]: DEBUG oslo_vmware.api [-] Task: {'id': task-3475360, 'name': CreateVM_Task} progress is 0%. {{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1127.818256] env[68906]: DEBUG oslo_vmware.api [-] Task: {'id': task-3475360, 'name': CreateVM_Task} progress is 25%. {{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1128.321752] env[68906]: DEBUG oslo_vmware.api [-] Task: {'id': task-3475360, 'name': CreateVM_Task, 'duration_secs': 0.798526} completed successfully. {{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1128.322044] env[68906]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 4d36bb91-0cde-44cb-8706-d17740a9cf50] Created VM on the ESX host {{(pid=68906) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1128.325141] env[68906]: DEBUG oslo_concurrency.lockutils [None req-bf384869-204f-4e55-8c64-1babf5601dfe tempest-ServerMetadataTestJSON-461341781 tempest-ServerMetadataTestJSON-461341781-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1128.325360] env[68906]: DEBUG oslo_concurrency.lockutils [None req-bf384869-204f-4e55-8c64-1babf5601dfe tempest-ServerMetadataTestJSON-461341781 tempest-ServerMetadataTestJSON-461341781-project-member] Acquired lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1128.325674] env[68906]: DEBUG oslo_concurrency.lockutils [None req-bf384869-204f-4e55-8c64-1babf5601dfe tempest-ServerMetadataTestJSON-461341781 tempest-ServerMetadataTestJSON-461341781-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1128.325957] env[68906]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-17020293-8a0c-4b08-8720-9b81a3ed2ea1 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1128.332979] env[68906]: DEBUG oslo_vmware.api [None req-bf384869-204f-4e55-8c64-1babf5601dfe tempest-ServerMetadataTestJSON-461341781 tempest-ServerMetadataTestJSON-461341781-project-member] Waiting for the task: (returnval){ [ 1128.332979] env[68906]: value = "session[52a3cb0d-1212-490c-2669-91043b4da4d8]52ed4670-ead8-2f38-df0f-6c181fbef047" [ 1128.332979] env[68906]: _type = "Task" [ 1128.332979] env[68906]: } to complete. {{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1128.341095] env[68906]: DEBUG oslo_vmware.api [None req-bf384869-204f-4e55-8c64-1babf5601dfe tempest-ServerMetadataTestJSON-461341781 tempest-ServerMetadataTestJSON-461341781-project-member] Task: {'id': session[52a3cb0d-1212-490c-2669-91043b4da4d8]52ed4670-ead8-2f38-df0f-6c181fbef047, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1128.843190] env[68906]: DEBUG oslo_concurrency.lockutils [None req-bf384869-204f-4e55-8c64-1babf5601dfe tempest-ServerMetadataTestJSON-461341781 tempest-ServerMetadataTestJSON-461341781-project-member] Releasing lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1128.843455] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-bf384869-204f-4e55-8c64-1babf5601dfe tempest-ServerMetadataTestJSON-461341781 tempest-ServerMetadataTestJSON-461341781-project-member] [instance: 4d36bb91-0cde-44cb-8706-d17740a9cf50] Processing image b1400c31-d33b-4e13-944f-4c645e62493e {{(pid=68906) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1128.843663] env[68906]: DEBUG oslo_concurrency.lockutils [None req-bf384869-204f-4e55-8c64-1babf5601dfe tempest-ServerMetadataTestJSON-461341781 tempest-ServerMetadataTestJSON-461341781-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e/b1400c31-d33b-4e13-944f-4c645e62493e.vmdk" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1129.064654] env[68906]: DEBUG nova.compute.manager [req-530d08bf-2478-4162-a41c-852ba1e8ed61 req-3ac71ed7-01b5-4a24-be42-5a21a34465cf service nova] [instance: 4d36bb91-0cde-44cb-8706-d17740a9cf50] Received event network-changed-98f87d2b-3488-4754-a82b-6211e5dfa8ca {{(pid=68906) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1129.064780] env[68906]: DEBUG nova.compute.manager [req-530d08bf-2478-4162-a41c-852ba1e8ed61 req-3ac71ed7-01b5-4a24-be42-5a21a34465cf service nova] [instance: 4d36bb91-0cde-44cb-8706-d17740a9cf50] Refreshing instance network info cache due to event network-changed-98f87d2b-3488-4754-a82b-6211e5dfa8ca. {{(pid=68906) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1129.064988] env[68906]: DEBUG oslo_concurrency.lockutils [req-530d08bf-2478-4162-a41c-852ba1e8ed61 req-3ac71ed7-01b5-4a24-be42-5a21a34465cf service nova] Acquiring lock "refresh_cache-4d36bb91-0cde-44cb-8706-d17740a9cf50" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1129.065135] env[68906]: DEBUG oslo_concurrency.lockutils [req-530d08bf-2478-4162-a41c-852ba1e8ed61 req-3ac71ed7-01b5-4a24-be42-5a21a34465cf service nova] Acquired lock "refresh_cache-4d36bb91-0cde-44cb-8706-d17740a9cf50" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1129.065300] env[68906]: DEBUG nova.network.neutron [req-530d08bf-2478-4162-a41c-852ba1e8ed61 req-3ac71ed7-01b5-4a24-be42-5a21a34465cf service nova] [instance: 4d36bb91-0cde-44cb-8706-d17740a9cf50] Refreshing network info cache for port 98f87d2b-3488-4754-a82b-6211e5dfa8ca {{(pid=68906) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1129.460126] env[68906]: DEBUG nova.network.neutron [req-530d08bf-2478-4162-a41c-852ba1e8ed61 req-3ac71ed7-01b5-4a24-be42-5a21a34465cf service nova] [instance: 4d36bb91-0cde-44cb-8706-d17740a9cf50] Updated VIF entry in instance network info cache for port 98f87d2b-3488-4754-a82b-6211e5dfa8ca. 
{{(pid=68906) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1129.460488] env[68906]: DEBUG nova.network.neutron [req-530d08bf-2478-4162-a41c-852ba1e8ed61 req-3ac71ed7-01b5-4a24-be42-5a21a34465cf service nova] [instance: 4d36bb91-0cde-44cb-8706-d17740a9cf50] Updating instance_info_cache with network_info: [{"id": "98f87d2b-3488-4754-a82b-6211e5dfa8ca", "address": "fa:16:3e:d0:70:65", "network": {"id": "81fcd64c-6798-46e1-9057-ad60eaf63dda", "bridge": "br-int", "label": "tempest-ServerMetadataTestJSON-1786758875-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "82f4fba881f4486786656229113cd0d0", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "b49c5024-2ced-42ca-90cc-6066766d43e6", "external-id": "nsx-vlan-transportzone-239", "segmentation_id": 239, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap98f87d2b-34", "ovs_interfaceid": "98f87d2b-3488-4754-a82b-6211e5dfa8ca", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68906) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1129.471331] env[68906]: DEBUG oslo_concurrency.lockutils [req-530d08bf-2478-4162-a41c-852ba1e8ed61 req-3ac71ed7-01b5-4a24-be42-5a21a34465cf service nova] Releasing lock "refresh_cache-4d36bb91-0cde-44cb-8706-d17740a9cf50" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1131.326580] env[68906]: DEBUG oslo_concurrency.lockutils [None req-f7b75175-bd21-425d-83cf-3460875241d7 tempest-ServerMetadataTestJSON-461341781 tempest-ServerMetadataTestJSON-461341781-project-member] Acquiring lock "4d36bb91-0cde-44cb-8706-d17740a9cf50" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1134.616149] env[68906]: DEBUG oslo_concurrency.lockutils [None req-5dfdbe4f-bf7e-46a4-9601-537f14957a5d tempest-ServerGroupTestJSON-848006695 tempest-ServerGroupTestJSON-848006695-project-member] Acquiring lock "7466df8a-59a9-49b9-bff7-c4efbeae3eee" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1134.616149] env[68906]: DEBUG oslo_concurrency.lockutils [None req-5dfdbe4f-bf7e-46a4-9601-537f14957a5d tempest-ServerGroupTestJSON-848006695 tempest-ServerGroupTestJSON-848006695-project-member] Lock "7466df8a-59a9-49b9-bff7-c4efbeae3eee" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1167.658857] env[68906]: WARNING oslo_vmware.rw_handles [None req-237bd299-a887-4870-a49f-dd296ebfa92b tempest-VolumesAssistedSnapshotsTest-429909811 tempest-VolumesAssistedSnapshotsTest-429909811-project-member] Error occurred 
while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1167.658857] env[68906]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1167.658857] env[68906]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1167.658857] env[68906]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1167.658857] env[68906]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1167.658857] env[68906]: ERROR oslo_vmware.rw_handles response.begin() [ 1167.658857] env[68906]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1167.658857] env[68906]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1167.658857] env[68906]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1167.658857] env[68906]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1167.658857] env[68906]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1167.658857] env[68906]: ERROR oslo_vmware.rw_handles [ 1167.659610] env[68906]: DEBUG nova.virt.vmwareapi.images [None req-237bd299-a887-4870-a49f-dd296ebfa92b tempest-VolumesAssistedSnapshotsTest-429909811 tempest-VolumesAssistedSnapshotsTest-429909811-project-member] [instance: 9a2d2803-34b1-40f7-9349-e5734a217e18] Downloaded image file data b1400c31-d33b-4e13-944f-4c645e62493e to vmware_temp/7b87e44a-0226-402c-821e-65c8f4798883/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk on the data store datastore2 {{(pid=68906) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1167.661306] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-237bd299-a887-4870-a49f-dd296ebfa92b tempest-VolumesAssistedSnapshotsTest-429909811 tempest-VolumesAssistedSnapshotsTest-429909811-project-member] [instance: 9a2d2803-34b1-40f7-9349-e5734a217e18] Caching image {{(pid=68906) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1167.661571] env[68906]: DEBUG nova.virt.vmwareapi.vm_util [None req-237bd299-a887-4870-a49f-dd296ebfa92b tempest-VolumesAssistedSnapshotsTest-429909811 tempest-VolumesAssistedSnapshotsTest-429909811-project-member] Copying Virtual Disk [datastore2] vmware_temp/7b87e44a-0226-402c-821e-65c8f4798883/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk to [datastore2] vmware_temp/7b87e44a-0226-402c-821e-65c8f4798883/b1400c31-d33b-4e13-944f-4c645e62493e/b1400c31-d33b-4e13-944f-4c645e62493e.vmdk {{(pid=68906) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1167.661866] env[68906]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-643d62c7-16cc-4165-b3e6-f402a6334a4e {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1167.670507] env[68906]: DEBUG oslo_vmware.api [None req-237bd299-a887-4870-a49f-dd296ebfa92b tempest-VolumesAssistedSnapshotsTest-429909811 tempest-VolumesAssistedSnapshotsTest-429909811-project-member] Waiting for the task: (returnval){ [ 1167.670507] env[68906]: value = "task-3475361" [ 1167.670507] env[68906]: _type = "Task" [ 1167.670507] env[68906]: } to complete. 
{{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1167.678588] env[68906]: DEBUG oslo_vmware.api [None req-237bd299-a887-4870-a49f-dd296ebfa92b tempest-VolumesAssistedSnapshotsTest-429909811 tempest-VolumesAssistedSnapshotsTest-429909811-project-member] Task: {'id': task-3475361, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1168.180338] env[68906]: DEBUG oslo_vmware.exceptions [None req-237bd299-a887-4870-a49f-dd296ebfa92b tempest-VolumesAssistedSnapshotsTest-429909811 tempest-VolumesAssistedSnapshotsTest-429909811-project-member] Fault InvalidArgument not matched. {{(pid=68906) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1168.180641] env[68906]: DEBUG oslo_concurrency.lockutils [None req-237bd299-a887-4870-a49f-dd296ebfa92b tempest-VolumesAssistedSnapshotsTest-429909811 tempest-VolumesAssistedSnapshotsTest-429909811-project-member] Releasing lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e/b1400c31-d33b-4e13-944f-4c645e62493e.vmdk" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1168.181431] env[68906]: ERROR nova.compute.manager [None req-237bd299-a887-4870-a49f-dd296ebfa92b tempest-VolumesAssistedSnapshotsTest-429909811 tempest-VolumesAssistedSnapshotsTest-429909811-project-member] [instance: 9a2d2803-34b1-40f7-9349-e5734a217e18] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1168.181431] env[68906]: Faults: ['InvalidArgument'] [ 1168.181431] env[68906]: ERROR nova.compute.manager [instance: 9a2d2803-34b1-40f7-9349-e5734a217e18] Traceback (most recent call last): [ 1168.181431] env[68906]: ERROR nova.compute.manager [instance: 9a2d2803-34b1-40f7-9349-e5734a217e18] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1168.181431] env[68906]: ERROR nova.compute.manager [instance: 9a2d2803-34b1-40f7-9349-e5734a217e18] yield resources [ 1168.181431] env[68906]: ERROR nova.compute.manager [instance: 9a2d2803-34b1-40f7-9349-e5734a217e18] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1168.181431] env[68906]: ERROR nova.compute.manager [instance: 9a2d2803-34b1-40f7-9349-e5734a217e18] self.driver.spawn(context, instance, image_meta, [ 1168.181431] env[68906]: ERROR nova.compute.manager [instance: 9a2d2803-34b1-40f7-9349-e5734a217e18] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1168.181431] env[68906]: ERROR nova.compute.manager [instance: 9a2d2803-34b1-40f7-9349-e5734a217e18] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1168.181431] env[68906]: ERROR nova.compute.manager [instance: 9a2d2803-34b1-40f7-9349-e5734a217e18] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1168.181431] env[68906]: ERROR nova.compute.manager [instance: 9a2d2803-34b1-40f7-9349-e5734a217e18] self._fetch_image_if_missing(context, vi) [ 1168.181431] env[68906]: ERROR nova.compute.manager [instance: 9a2d2803-34b1-40f7-9349-e5734a217e18] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1168.181781] env[68906]: ERROR nova.compute.manager [instance: 9a2d2803-34b1-40f7-9349-e5734a217e18] image_cache(vi, tmp_image_ds_loc) [ 1168.181781] env[68906]: ERROR 
nova.compute.manager [instance: 9a2d2803-34b1-40f7-9349-e5734a217e18] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1168.181781] env[68906]: ERROR nova.compute.manager [instance: 9a2d2803-34b1-40f7-9349-e5734a217e18] vm_util.copy_virtual_disk( [ 1168.181781] env[68906]: ERROR nova.compute.manager [instance: 9a2d2803-34b1-40f7-9349-e5734a217e18] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1168.181781] env[68906]: ERROR nova.compute.manager [instance: 9a2d2803-34b1-40f7-9349-e5734a217e18] session._wait_for_task(vmdk_copy_task) [ 1168.181781] env[68906]: ERROR nova.compute.manager [instance: 9a2d2803-34b1-40f7-9349-e5734a217e18] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1168.181781] env[68906]: ERROR nova.compute.manager [instance: 9a2d2803-34b1-40f7-9349-e5734a217e18] return self.wait_for_task(task_ref) [ 1168.181781] env[68906]: ERROR nova.compute.manager [instance: 9a2d2803-34b1-40f7-9349-e5734a217e18] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1168.181781] env[68906]: ERROR nova.compute.manager [instance: 9a2d2803-34b1-40f7-9349-e5734a217e18] return evt.wait() [ 1168.181781] env[68906]: ERROR nova.compute.manager [instance: 9a2d2803-34b1-40f7-9349-e5734a217e18] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1168.181781] env[68906]: ERROR nova.compute.manager [instance: 9a2d2803-34b1-40f7-9349-e5734a217e18] result = hub.switch() [ 1168.181781] env[68906]: ERROR nova.compute.manager [instance: 9a2d2803-34b1-40f7-9349-e5734a217e18] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1168.181781] env[68906]: ERROR nova.compute.manager [instance: 9a2d2803-34b1-40f7-9349-e5734a217e18] return self.greenlet.switch() [ 1168.182192] env[68906]: ERROR nova.compute.manager [instance: 9a2d2803-34b1-40f7-9349-e5734a217e18] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1168.182192] env[68906]: ERROR nova.compute.manager [instance: 9a2d2803-34b1-40f7-9349-e5734a217e18] self.f(*self.args, **self.kw) [ 1168.182192] env[68906]: ERROR nova.compute.manager [instance: 9a2d2803-34b1-40f7-9349-e5734a217e18] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1168.182192] env[68906]: ERROR nova.compute.manager [instance: 9a2d2803-34b1-40f7-9349-e5734a217e18] raise exceptions.translate_fault(task_info.error) [ 1168.182192] env[68906]: ERROR nova.compute.manager [instance: 9a2d2803-34b1-40f7-9349-e5734a217e18] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1168.182192] env[68906]: ERROR nova.compute.manager [instance: 9a2d2803-34b1-40f7-9349-e5734a217e18] Faults: ['InvalidArgument'] [ 1168.182192] env[68906]: ERROR nova.compute.manager [instance: 9a2d2803-34b1-40f7-9349-e5734a217e18] [ 1168.182192] env[68906]: INFO nova.compute.manager [None req-237bd299-a887-4870-a49f-dd296ebfa92b tempest-VolumesAssistedSnapshotsTest-429909811 tempest-VolumesAssistedSnapshotsTest-429909811-project-member] [instance: 9a2d2803-34b1-40f7-9349-e5734a217e18] Terminating instance [ 1168.183338] env[68906]: DEBUG oslo_concurrency.lockutils [None req-f739b187-6eca-4474-a1be-8f642c15af34 tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] Acquired 
lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e/b1400c31-d33b-4e13-944f-4c645e62493e.vmdk" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1168.183550] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-f739b187-6eca-4474-a1be-8f642c15af34 tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=68906) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1168.183787] env[68906]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-89953db0-ddc3-43f6-9d68-f49fc4727b73 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1168.185974] env[68906]: DEBUG nova.compute.manager [None req-237bd299-a887-4870-a49f-dd296ebfa92b tempest-VolumesAssistedSnapshotsTest-429909811 tempest-VolumesAssistedSnapshotsTest-429909811-project-member] [instance: 9a2d2803-34b1-40f7-9349-e5734a217e18] Start destroying the instance on the hypervisor. {{(pid=68906) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1168.186207] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-237bd299-a887-4870-a49f-dd296ebfa92b tempest-VolumesAssistedSnapshotsTest-429909811 tempest-VolumesAssistedSnapshotsTest-429909811-project-member] [instance: 9a2d2803-34b1-40f7-9349-e5734a217e18] Destroying instance {{(pid=68906) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1168.187027] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cc08ce7b-2d69-4dc3-81ce-1793748cab9c {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1168.193880] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-237bd299-a887-4870-a49f-dd296ebfa92b tempest-VolumesAssistedSnapshotsTest-429909811 tempest-VolumesAssistedSnapshotsTest-429909811-project-member] [instance: 9a2d2803-34b1-40f7-9349-e5734a217e18] Unregistering the VM {{(pid=68906) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1168.194143] env[68906]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-d2c44913-7d4d-48cd-b0ef-d601ac4c5d3d {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1168.198067] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-f739b187-6eca-4474-a1be-8f642c15af34 tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=68906) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1168.198067] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-f739b187-6eca-4474-a1be-8f642c15af34 tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] Folder [datastore2] devstack-image-cache_base created. 
{{(pid=68906) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1168.198067] env[68906]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-08e2913b-44e8-47a8-9cd7-2484bd7a28c2 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1168.202333] env[68906]: DEBUG oslo_vmware.api [None req-f739b187-6eca-4474-a1be-8f642c15af34 tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] Waiting for the task: (returnval){ [ 1168.202333] env[68906]: value = "session[52a3cb0d-1212-490c-2669-91043b4da4d8]52df435e-8ba7-1105-5449-bb43faedebcd" [ 1168.202333] env[68906]: _type = "Task" [ 1168.202333] env[68906]: } to complete. {{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1168.209772] env[68906]: DEBUG oslo_vmware.api [None req-f739b187-6eca-4474-a1be-8f642c15af34 tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] Task: {'id': session[52a3cb0d-1212-490c-2669-91043b4da4d8]52df435e-8ba7-1105-5449-bb43faedebcd, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1168.269025] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-237bd299-a887-4870-a49f-dd296ebfa92b tempest-VolumesAssistedSnapshotsTest-429909811 tempest-VolumesAssistedSnapshotsTest-429909811-project-member] [instance: 9a2d2803-34b1-40f7-9349-e5734a217e18] Unregistered the VM {{(pid=68906) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1168.269025] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-237bd299-a887-4870-a49f-dd296ebfa92b tempest-VolumesAssistedSnapshotsTest-429909811 tempest-VolumesAssistedSnapshotsTest-429909811-project-member] [instance: 9a2d2803-34b1-40f7-9349-e5734a217e18] Deleting contents of the VM from datastore datastore2 {{(pid=68906) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1168.269025] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-237bd299-a887-4870-a49f-dd296ebfa92b tempest-VolumesAssistedSnapshotsTest-429909811 tempest-VolumesAssistedSnapshotsTest-429909811-project-member] Deleting the datastore file [datastore2] 9a2d2803-34b1-40f7-9349-e5734a217e18 {{(pid=68906) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1168.269025] env[68906]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-4fd8002b-c838-49b7-a92e-92f216e3d446 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1168.275453] env[68906]: DEBUG oslo_vmware.api [None req-237bd299-a887-4870-a49f-dd296ebfa92b tempest-VolumesAssistedSnapshotsTest-429909811 tempest-VolumesAssistedSnapshotsTest-429909811-project-member] Waiting for the task: (returnval){ [ 1168.275453] env[68906]: value = "task-3475363" [ 1168.275453] env[68906]: _type = "Task" [ 1168.275453] env[68906]: } to complete. {{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1168.285839] env[68906]: DEBUG oslo_vmware.api [None req-237bd299-a887-4870-a49f-dd296ebfa92b tempest-VolumesAssistedSnapshotsTest-429909811 tempest-VolumesAssistedSnapshotsTest-429909811-project-member] Task: {'id': task-3475363, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1168.712190] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-f739b187-6eca-4474-a1be-8f642c15af34 tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] [instance: 13eebe4e-5984-46c3-bb73-cd783ad45df6] Preparing fetch location {{(pid=68906) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1168.712488] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-f739b187-6eca-4474-a1be-8f642c15af34 tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] Creating directory with path [datastore2] vmware_temp/84c94c38-c8f4-422a-b6da-87da5da69cf5/b1400c31-d33b-4e13-944f-4c645e62493e {{(pid=68906) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1168.712678] env[68906]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-05a43d2d-31fb-4add-9d91-aa5b25e672e4 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1168.723728] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-f739b187-6eca-4474-a1be-8f642c15af34 tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] Created directory with path [datastore2] vmware_temp/84c94c38-c8f4-422a-b6da-87da5da69cf5/b1400c31-d33b-4e13-944f-4c645e62493e {{(pid=68906) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1168.723915] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-f739b187-6eca-4474-a1be-8f642c15af34 tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] [instance: 13eebe4e-5984-46c3-bb73-cd783ad45df6] Fetch image to [datastore2] vmware_temp/84c94c38-c8f4-422a-b6da-87da5da69cf5/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk {{(pid=68906) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1168.724098] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-f739b187-6eca-4474-a1be-8f642c15af34 tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] [instance: 13eebe4e-5984-46c3-bb73-cd783ad45df6] Downloading image file data b1400c31-d33b-4e13-944f-4c645e62493e to [datastore2] vmware_temp/84c94c38-c8f4-422a-b6da-87da5da69cf5/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk on the data store datastore2 {{(pid=68906) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1168.724794] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e2d34755-9b2b-4e40-aef7-22751a4407ef {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1168.731361] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a8d7677f-9722-4187-98ac-467546017c16 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1168.740069] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1f2d93ec-ad15-4e7e-9a52-eb7464f333a4 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1168.769633] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-48a0be7a-4c31-4fb6-9bee-7a2124ad1205 {{(pid=68906) 
request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1168.774845] env[68906]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-1e26f3e4-74d9-4b5e-bddd-736b1f6f1868 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1168.783376] env[68906]: DEBUG oslo_vmware.api [None req-237bd299-a887-4870-a49f-dd296ebfa92b tempest-VolumesAssistedSnapshotsTest-429909811 tempest-VolumesAssistedSnapshotsTest-429909811-project-member] Task: {'id': task-3475363, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.075733} completed successfully. {{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1168.783598] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-237bd299-a887-4870-a49f-dd296ebfa92b tempest-VolumesAssistedSnapshotsTest-429909811 tempest-VolumesAssistedSnapshotsTest-429909811-project-member] Deleted the datastore file {{(pid=68906) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1168.783773] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-237bd299-a887-4870-a49f-dd296ebfa92b tempest-VolumesAssistedSnapshotsTest-429909811 tempest-VolumesAssistedSnapshotsTest-429909811-project-member] [instance: 9a2d2803-34b1-40f7-9349-e5734a217e18] Deleted contents of the VM from datastore datastore2 {{(pid=68906) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1168.783943] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-237bd299-a887-4870-a49f-dd296ebfa92b tempest-VolumesAssistedSnapshotsTest-429909811 tempest-VolumesAssistedSnapshotsTest-429909811-project-member] [instance: 9a2d2803-34b1-40f7-9349-e5734a217e18] Instance destroyed {{(pid=68906) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1168.784137] env[68906]: INFO nova.compute.manager [None req-237bd299-a887-4870-a49f-dd296ebfa92b tempest-VolumesAssistedSnapshotsTest-429909811 tempest-VolumesAssistedSnapshotsTest-429909811-project-member] [instance: 9a2d2803-34b1-40f7-9349-e5734a217e18] Took 0.60 seconds to destroy the instance on the hypervisor. 
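The CopyVirtualDisk_Task failure above follows the oslo.vmware polling contract visible in the wait_for_task/_poll_task frames: task info is sampled on an interval, progress is logged while the task runs, and an "error" terminal state is translated into a VimFaultException carrying the fault list (here ['InvalidArgument'] for the bad fileType argument). A minimal sketch of that contract, assuming a hypothetical get_task_info() callable and a local VimFaultException stand-in for oslo_vmware.exceptions.VimFaultException:

import time

class VimFaultException(Exception):
    # Local stand-in for oslo_vmware.exceptions.VimFaultException.
    def __init__(self, fault_list, message):
        super().__init__(message)
        self.fault_list = fault_list

def wait_for_task(get_task_info, interval=0.5):
    # Poll until the task leaves the queued/running states, mirroring the
    # "progress is N%" lines in the log above.
    while True:
        info = get_task_info()
        if info['state'] == 'success':
            return info
        if info['state'] == 'error':
            # Translate the VIM fault into a Python exception, as _poll_task
            # does via exceptions.translate_fault(task_info.error).
            raise VimFaultException(info.get('faults', []), info['message'])
        print("Task: %s progress is %d%%" % (info['name'], info['progress']))
        time.sleep(interval)

# Replaying the failure recorded above:
states = iter([
    {'state': 'running', 'name': 'CopyVirtualDisk_Task', 'progress': 0},
    {'state': 'error', 'faults': ['InvalidArgument'],
     'message': 'A specified parameter was not correct: fileType'},
])
try:
    wait_for_task(lambda: next(states))
except VimFaultException as e:
    print('Faults:', e.fault_list)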
[ 1168.786193] env[68906]: DEBUG nova.compute.claims [None req-237bd299-a887-4870-a49f-dd296ebfa92b tempest-VolumesAssistedSnapshotsTest-429909811 tempest-VolumesAssistedSnapshotsTest-429909811-project-member] [instance: 9a2d2803-34b1-40f7-9349-e5734a217e18] Aborting claim: {{(pid=68906) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1168.786397] env[68906]: DEBUG oslo_concurrency.lockutils [None req-237bd299-a887-4870-a49f-dd296ebfa92b tempest-VolumesAssistedSnapshotsTest-429909811 tempest-VolumesAssistedSnapshotsTest-429909811-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1168.786619] env[68906]: DEBUG oslo_concurrency.lockutils [None req-237bd299-a887-4870-a49f-dd296ebfa92b tempest-VolumesAssistedSnapshotsTest-429909811 tempest-VolumesAssistedSnapshotsTest-429909811-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1168.805460] env[68906]: DEBUG nova.virt.vmwareapi.images [None req-f739b187-6eca-4474-a1be-8f642c15af34 tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] [instance: 13eebe4e-5984-46c3-bb73-cd783ad45df6] Downloading image file data b1400c31-d33b-4e13-944f-4c645e62493e to the data store datastore2 {{(pid=68906) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1168.854103] env[68906]: DEBUG oslo_vmware.rw_handles [None req-f739b187-6eca-4474-a1be-8f642c15af34 tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/84c94c38-c8f4-422a-b6da-87da5da69cf5/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=68906) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1168.914785] env[68906]: DEBUG oslo_vmware.rw_handles [None req-f739b187-6eca-4474-a1be-8f642c15af34 tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] Completed reading data from the image iterator. {{(pid=68906) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1168.915032] env[68906]: DEBUG oslo_vmware.rw_handles [None req-f739b187-6eca-4474-a1be-8f642c15af34 tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] Closing write handle for https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/84c94c38-c8f4-422a-b6da-87da5da69cf5/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=68906) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1169.174950] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dd1e0263-9dbd-44ed-bfd6-9894aaa85cc3 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1169.182737] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f024b0d2-5cdd-4d75-b759-0b61a669347e {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1169.212964] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c1e331e8-b621-4a82-9224-0214eb6651ba {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1169.220332] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7535bc39-fc51-4ef9-8fa2-673556f20317 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1169.233218] env[68906]: DEBUG nova.compute.provider_tree [None req-237bd299-a887-4870-a49f-dd296ebfa92b tempest-VolumesAssistedSnapshotsTest-429909811 tempest-VolumesAssistedSnapshotsTest-429909811-project-member] Inventory has not changed in ProviderTree for provider: 1119f6db-bfd7-4ef3-bdff-5c6974dc249b {{(pid=68906) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1169.244860] env[68906]: DEBUG nova.scheduler.client.report [None req-237bd299-a887-4870-a49f-dd296ebfa92b tempest-VolumesAssistedSnapshotsTest-429909811 tempest-VolumesAssistedSnapshotsTest-429909811-project-member] Inventory has not changed for provider 1119f6db-bfd7-4ef3-bdff-5c6974dc249b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 93, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68906) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1169.258731] env[68906]: DEBUG oslo_concurrency.lockutils [None req-237bd299-a887-4870-a49f-dd296ebfa92b tempest-VolumesAssistedSnapshotsTest-429909811 tempest-VolumesAssistedSnapshotsTest-429909811-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.472s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1169.259255] env[68906]: ERROR nova.compute.manager [None req-237bd299-a887-4870-a49f-dd296ebfa92b tempest-VolumesAssistedSnapshotsTest-429909811 tempest-VolumesAssistedSnapshotsTest-429909811-project-member] [instance: 9a2d2803-34b1-40f7-9349-e5734a217e18] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1169.259255] env[68906]: Faults: ['InvalidArgument'] [ 1169.259255] env[68906]: ERROR nova.compute.manager [instance: 9a2d2803-34b1-40f7-9349-e5734a217e18] Traceback (most recent call last): [ 1169.259255] env[68906]: ERROR nova.compute.manager [instance: 9a2d2803-34b1-40f7-9349-e5734a217e18] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in 
_build_and_run_instance [ 1169.259255] env[68906]: ERROR nova.compute.manager [instance: 9a2d2803-34b1-40f7-9349-e5734a217e18] self.driver.spawn(context, instance, image_meta, [ 1169.259255] env[68906]: ERROR nova.compute.manager [instance: 9a2d2803-34b1-40f7-9349-e5734a217e18] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1169.259255] env[68906]: ERROR nova.compute.manager [instance: 9a2d2803-34b1-40f7-9349-e5734a217e18] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1169.259255] env[68906]: ERROR nova.compute.manager [instance: 9a2d2803-34b1-40f7-9349-e5734a217e18] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1169.259255] env[68906]: ERROR nova.compute.manager [instance: 9a2d2803-34b1-40f7-9349-e5734a217e18] self._fetch_image_if_missing(context, vi) [ 1169.259255] env[68906]: ERROR nova.compute.manager [instance: 9a2d2803-34b1-40f7-9349-e5734a217e18] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1169.259255] env[68906]: ERROR nova.compute.manager [instance: 9a2d2803-34b1-40f7-9349-e5734a217e18] image_cache(vi, tmp_image_ds_loc) [ 1169.259255] env[68906]: ERROR nova.compute.manager [instance: 9a2d2803-34b1-40f7-9349-e5734a217e18] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1169.259621] env[68906]: ERROR nova.compute.manager [instance: 9a2d2803-34b1-40f7-9349-e5734a217e18] vm_util.copy_virtual_disk( [ 1169.259621] env[68906]: ERROR nova.compute.manager [instance: 9a2d2803-34b1-40f7-9349-e5734a217e18] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1169.259621] env[68906]: ERROR nova.compute.manager [instance: 9a2d2803-34b1-40f7-9349-e5734a217e18] session._wait_for_task(vmdk_copy_task) [ 1169.259621] env[68906]: ERROR nova.compute.manager [instance: 9a2d2803-34b1-40f7-9349-e5734a217e18] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1169.259621] env[68906]: ERROR nova.compute.manager [instance: 9a2d2803-34b1-40f7-9349-e5734a217e18] return self.wait_for_task(task_ref) [ 1169.259621] env[68906]: ERROR nova.compute.manager [instance: 9a2d2803-34b1-40f7-9349-e5734a217e18] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1169.259621] env[68906]: ERROR nova.compute.manager [instance: 9a2d2803-34b1-40f7-9349-e5734a217e18] return evt.wait() [ 1169.259621] env[68906]: ERROR nova.compute.manager [instance: 9a2d2803-34b1-40f7-9349-e5734a217e18] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1169.259621] env[68906]: ERROR nova.compute.manager [instance: 9a2d2803-34b1-40f7-9349-e5734a217e18] result = hub.switch() [ 1169.259621] env[68906]: ERROR nova.compute.manager [instance: 9a2d2803-34b1-40f7-9349-e5734a217e18] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1169.259621] env[68906]: ERROR nova.compute.manager [instance: 9a2d2803-34b1-40f7-9349-e5734a217e18] return self.greenlet.switch() [ 1169.259621] env[68906]: ERROR nova.compute.manager [instance: 9a2d2803-34b1-40f7-9349-e5734a217e18] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1169.259621] env[68906]: ERROR nova.compute.manager [instance: 9a2d2803-34b1-40f7-9349-e5734a217e18] self.f(*self.args, **self.kw) [ 1169.259978] env[68906]: ERROR nova.compute.manager [instance: 
9a2d2803-34b1-40f7-9349-e5734a217e18] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1169.259978] env[68906]: ERROR nova.compute.manager [instance: 9a2d2803-34b1-40f7-9349-e5734a217e18] raise exceptions.translate_fault(task_info.error) [ 1169.259978] env[68906]: ERROR nova.compute.manager [instance: 9a2d2803-34b1-40f7-9349-e5734a217e18] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1169.259978] env[68906]: ERROR nova.compute.manager [instance: 9a2d2803-34b1-40f7-9349-e5734a217e18] Faults: ['InvalidArgument'] [ 1169.259978] env[68906]: ERROR nova.compute.manager [instance: 9a2d2803-34b1-40f7-9349-e5734a217e18] [ 1169.259978] env[68906]: DEBUG nova.compute.utils [None req-237bd299-a887-4870-a49f-dd296ebfa92b tempest-VolumesAssistedSnapshotsTest-429909811 tempest-VolumesAssistedSnapshotsTest-429909811-project-member] [instance: 9a2d2803-34b1-40f7-9349-e5734a217e18] VimFaultException {{(pid=68906) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1169.262519] env[68906]: DEBUG nova.compute.manager [None req-237bd299-a887-4870-a49f-dd296ebfa92b tempest-VolumesAssistedSnapshotsTest-429909811 tempest-VolumesAssistedSnapshotsTest-429909811-project-member] [instance: 9a2d2803-34b1-40f7-9349-e5734a217e18] Build of instance 9a2d2803-34b1-40f7-9349-e5734a217e18 was re-scheduled: A specified parameter was not correct: fileType [ 1169.262519] env[68906]: Faults: ['InvalidArgument'] {{(pid=68906) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1169.262887] env[68906]: DEBUG nova.compute.manager [None req-237bd299-a887-4870-a49f-dd296ebfa92b tempest-VolumesAssistedSnapshotsTest-429909811 tempest-VolumesAssistedSnapshotsTest-429909811-project-member] [instance: 9a2d2803-34b1-40f7-9349-e5734a217e18] Unplugging VIFs for instance {{(pid=68906) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1169.263085] env[68906]: DEBUG nova.compute.manager [None req-237bd299-a887-4870-a49f-dd296ebfa92b tempest-VolumesAssistedSnapshotsTest-429909811 tempest-VolumesAssistedSnapshotsTest-429909811-project-member] Virt driver does not provide unplug_vifs method, so it is not possible to determine if VIFs should be unplugged.
{{(pid=68906) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1169.263244] env[68906]: DEBUG nova.compute.manager [None req-237bd299-a887-4870-a49f-dd296ebfa92b tempest-VolumesAssistedSnapshotsTest-429909811 tempest-VolumesAssistedSnapshotsTest-429909811-project-member] [instance: 9a2d2803-34b1-40f7-9349-e5734a217e18] Deallocating network for instance {{(pid=68906) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1169.263406] env[68906]: DEBUG nova.network.neutron [None req-237bd299-a887-4870-a49f-dd296ebfa92b tempest-VolumesAssistedSnapshotsTest-429909811 tempest-VolumesAssistedSnapshotsTest-429909811-project-member] [instance: 9a2d2803-34b1-40f7-9349-e5734a217e18] deallocate_for_instance() {{(pid=68906) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1169.839461] env[68906]: DEBUG nova.network.neutron [None req-237bd299-a887-4870-a49f-dd296ebfa92b tempest-VolumesAssistedSnapshotsTest-429909811 tempest-VolumesAssistedSnapshotsTest-429909811-project-member] [instance: 9a2d2803-34b1-40f7-9349-e5734a217e18] Updating instance_info_cache with network_info: [] {{(pid=68906) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1169.851033] env[68906]: INFO nova.compute.manager [None req-237bd299-a887-4870-a49f-dd296ebfa92b tempest-VolumesAssistedSnapshotsTest-429909811 tempest-VolumesAssistedSnapshotsTest-429909811-project-member] [instance: 9a2d2803-34b1-40f7-9349-e5734a217e18] Took 0.59 seconds to deallocate network for instance. [ 1169.956196] env[68906]: INFO nova.scheduler.client.report [None req-237bd299-a887-4870-a49f-dd296ebfa92b tempest-VolumesAssistedSnapshotsTest-429909811 tempest-VolumesAssistedSnapshotsTest-429909811-project-member] Deleted allocations for instance 9a2d2803-34b1-40f7-9349-e5734a217e18 [ 1169.974881] env[68906]: DEBUG oslo_concurrency.lockutils [None req-237bd299-a887-4870-a49f-dd296ebfa92b tempest-VolumesAssistedSnapshotsTest-429909811 tempest-VolumesAssistedSnapshotsTest-429909811-project-member] Lock "9a2d2803-34b1-40f7-9349-e5734a217e18" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 521.977s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1169.976197] env[68906]: DEBUG oslo_concurrency.lockutils [None req-3688da09-dcdf-4b50-97dc-63e8cf24e699 tempest-VolumesAssistedSnapshotsTest-429909811 tempest-VolumesAssistedSnapshotsTest-429909811-project-member] Lock "9a2d2803-34b1-40f7-9349-e5734a217e18" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 324.151s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1169.976449] env[68906]: DEBUG oslo_concurrency.lockutils [None req-3688da09-dcdf-4b50-97dc-63e8cf24e699 tempest-VolumesAssistedSnapshotsTest-429909811 tempest-VolumesAssistedSnapshotsTest-429909811-project-member] Acquiring lock "9a2d2803-34b1-40f7-9349-e5734a217e18-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1169.976662] env[68906]: DEBUG oslo_concurrency.lockutils [None req-3688da09-dcdf-4b50-97dc-63e8cf24e699 tempest-VolumesAssistedSnapshotsTest-429909811 tempest-VolumesAssistedSnapshotsTest-429909811-project-member] Lock "9a2d2803-34b1-40f7-9349-e5734a217e18-events" 
acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1169.976833] env[68906]: DEBUG oslo_concurrency.lockutils [None req-3688da09-dcdf-4b50-97dc-63e8cf24e699 tempest-VolumesAssistedSnapshotsTest-429909811 tempest-VolumesAssistedSnapshotsTest-429909811-project-member] Lock "9a2d2803-34b1-40f7-9349-e5734a217e18-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1169.978792] env[68906]: INFO nova.compute.manager [None req-3688da09-dcdf-4b50-97dc-63e8cf24e699 tempest-VolumesAssistedSnapshotsTest-429909811 tempest-VolumesAssistedSnapshotsTest-429909811-project-member] [instance: 9a2d2803-34b1-40f7-9349-e5734a217e18] Terminating instance [ 1169.980513] env[68906]: DEBUG nova.compute.manager [None req-3688da09-dcdf-4b50-97dc-63e8cf24e699 tempest-VolumesAssistedSnapshotsTest-429909811 tempest-VolumesAssistedSnapshotsTest-429909811-project-member] [instance: 9a2d2803-34b1-40f7-9349-e5734a217e18] Start destroying the instance on the hypervisor. {{(pid=68906) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1169.980710] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-3688da09-dcdf-4b50-97dc-63e8cf24e699 tempest-VolumesAssistedSnapshotsTest-429909811 tempest-VolumesAssistedSnapshotsTest-429909811-project-member] [instance: 9a2d2803-34b1-40f7-9349-e5734a217e18] Destroying instance {{(pid=68906) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1169.981184] env[68906]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-d52dacd3-12ce-4973-8792-4088ad7f5d60 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1169.990458] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5a479e92-e10e-4eb6-800a-d0111731d522 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1170.001416] env[68906]: DEBUG nova.compute.manager [None req-4d72aadc-5a5f-4626-b220-381c1afbdf3f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] [instance: db011373-7285-4882-8bce-d39cfa22fe80] Starting instance... {{(pid=68906) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1170.024105] env[68906]: WARNING nova.virt.vmwareapi.vmops [None req-3688da09-dcdf-4b50-97dc-63e8cf24e699 tempest-VolumesAssistedSnapshotsTest-429909811 tempest-VolumesAssistedSnapshotsTest-429909811-project-member] [instance: 9a2d2803-34b1-40f7-9349-e5734a217e18] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 9a2d2803-34b1-40f7-9349-e5734a217e18 could not be found. 
[ 1170.024317] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-3688da09-dcdf-4b50-97dc-63e8cf24e699 tempest-VolumesAssistedSnapshotsTest-429909811 tempest-VolumesAssistedSnapshotsTest-429909811-project-member] [instance: 9a2d2803-34b1-40f7-9349-e5734a217e18] Instance destroyed {{(pid=68906) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1170.024491] env[68906]: INFO nova.compute.manager [None req-3688da09-dcdf-4b50-97dc-63e8cf24e699 tempest-VolumesAssistedSnapshotsTest-429909811 tempest-VolumesAssistedSnapshotsTest-429909811-project-member] [instance: 9a2d2803-34b1-40f7-9349-e5734a217e18] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1170.024729] env[68906]: DEBUG oslo.service.loopingcall [None req-3688da09-dcdf-4b50-97dc-63e8cf24e699 tempest-VolumesAssistedSnapshotsTest-429909811 tempest-VolumesAssistedSnapshotsTest-429909811-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=68906) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1170.024943] env[68906]: DEBUG nova.compute.manager [-] [instance: 9a2d2803-34b1-40f7-9349-e5734a217e18] Deallocating network for instance {{(pid=68906) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1170.025051] env[68906]: DEBUG nova.network.neutron [-] [instance: 9a2d2803-34b1-40f7-9349-e5734a217e18] deallocate_for_instance() {{(pid=68906) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1170.049800] env[68906]: DEBUG nova.network.neutron [-] [instance: 9a2d2803-34b1-40f7-9349-e5734a217e18] Updating instance_info_cache with network_info: [] {{(pid=68906) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1170.052695] env[68906]: DEBUG oslo_concurrency.lockutils [None req-4d72aadc-5a5f-4626-b220-381c1afbdf3f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1170.052924] env[68906]: DEBUG oslo_concurrency.lockutils [None req-4d72aadc-5a5f-4626-b220-381c1afbdf3f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1170.054400] env[68906]: INFO nova.compute.claims [None req-4d72aadc-5a5f-4626-b220-381c1afbdf3f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] [instance: db011373-7285-4882-8bce-d39cfa22fe80] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1170.057488] env[68906]: INFO nova.compute.manager [-] [instance: 9a2d2803-34b1-40f7-9349-e5734a217e18] Took 0.03 seconds to deallocate network for instance. 
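The oslo.service loopingcall entry above wraps network deallocation in a retry helper (_deallocate_network_with_retries) so a transient Neutron error does not leak port allocations. A stdlib-only sketch of that shape, assuming a hypothetical deallocate_for_instance() callable in place of the Neutron API client (the real code drives the retries through oslo_service.loopingcall rather than a plain loop):

import time

def deallocate_network_with_retries(deallocate_for_instance, instance_uuid,
                                    attempts=3, delay=1.0):
    # Retry the deallocation a bounded number of times before giving up.
    for attempt in range(1, attempts + 1):
        try:
            deallocate_for_instance(instance_uuid)
            return
        except Exception as exc:  # the real code narrows this to API errors
            if attempt == attempts:
                raise
            print('deallocate attempt %d failed (%s); retrying in %.1fs'
                  % (attempt, exc, delay))
            time.sleep(delay)

# Succeeds on the second attempt:
calls = {'n': 0}
def flaky(uuid):
    calls['n'] += 1
    if calls['n'] == 1:
        raise RuntimeError('neutron timeout')

deallocate_network_with_retries(flaky, '9a2d2803-34b1-40f7-9349-e5734a217e18')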
[ 1170.138105] env[68906]: DEBUG oslo_concurrency.lockutils [None req-3688da09-dcdf-4b50-97dc-63e8cf24e699 tempest-VolumesAssistedSnapshotsTest-429909811 tempest-VolumesAssistedSnapshotsTest-429909811-project-member] Lock "9a2d2803-34b1-40f7-9349-e5734a217e18" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.162s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1170.378145] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1d82c5d5-201e-494a-93eb-0b73b7c3bfc6 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1170.385810] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-962f772b-7aef-4258-9234-202f491b0f1c {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1170.416082] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a04f569e-534d-4ef1-bf7a-db7dc199bc58 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1170.422491] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2faaa086-7745-49fb-b856-71b26dd0ee8a {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1170.435718] env[68906]: DEBUG nova.compute.provider_tree [None req-4d72aadc-5a5f-4626-b220-381c1afbdf3f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Inventory has not changed in ProviderTree for provider: 1119f6db-bfd7-4ef3-bdff-5c6974dc249b {{(pid=68906) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1170.445028] env[68906]: DEBUG nova.scheduler.client.report [None req-4d72aadc-5a5f-4626-b220-381c1afbdf3f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Inventory has not changed for provider 1119f6db-bfd7-4ef3-bdff-5c6974dc249b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 93, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68906) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1170.460453] env[68906]: DEBUG oslo_concurrency.lockutils [None req-4d72aadc-5a5f-4626-b220-381c1afbdf3f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.407s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1170.460941] env[68906]: DEBUG nova.compute.manager [None req-4d72aadc-5a5f-4626-b220-381c1afbdf3f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] [instance: db011373-7285-4882-8bce-d39cfa22fe80] Start building networks asynchronously for instance.
{{(pid=68906) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1170.499053] env[68906]: DEBUG nova.compute.utils [None req-4d72aadc-5a5f-4626-b220-381c1afbdf3f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Using /dev/sd instead of None {{(pid=68906) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1170.500745] env[68906]: DEBUG nova.compute.manager [None req-4d72aadc-5a5f-4626-b220-381c1afbdf3f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] [instance: db011373-7285-4882-8bce-d39cfa22fe80] Allocating IP information in the background. {{(pid=68906) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1170.500745] env[68906]: DEBUG nova.network.neutron [None req-4d72aadc-5a5f-4626-b220-381c1afbdf3f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] [instance: db011373-7285-4882-8bce-d39cfa22fe80] allocate_for_instance() {{(pid=68906) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1170.508701] env[68906]: DEBUG nova.compute.manager [None req-4d72aadc-5a5f-4626-b220-381c1afbdf3f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] [instance: db011373-7285-4882-8bce-d39cfa22fe80] Start building block device mappings for instance. {{(pid=68906) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1170.577928] env[68906]: DEBUG nova.compute.manager [None req-4d72aadc-5a5f-4626-b220-381c1afbdf3f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] [instance: db011373-7285-4882-8bce-d39cfa22fe80] Start spawning the instance on the hypervisor. 
{{(pid=68906) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1170.604697] env[68906]: DEBUG nova.policy [None req-4d72aadc-5a5f-4626-b220-381c1afbdf3f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'c08a6c439ba94d18b742a133848aaaae', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '0e206dedfb584e219a7f5dd633032515', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68906) authorize /opt/stack/nova/nova/policy.py:203}} [ 1170.608298] env[68906]: DEBUG nova.virt.hardware [None req-4d72aadc-5a5f-4626-b220-381c1afbdf3f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-17T13:00:39Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-17T13:00:23Z,direct_url=<?>,disk_format='vmdk',id=b1400c31-d33b-4e13-944f-4c645e62493e,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='1ae7bf3a375d41c6af5e7536af51ffd1',properties=ImageMetaProps,protected=<?>,size=21318656,status='active',tags=<?>,updated_at=2025-04-17T13:00:24Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=68906) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1170.608539] env[68906]: DEBUG nova.virt.hardware [None req-4d72aadc-5a5f-4626-b220-381c1afbdf3f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Flavor limits 0:0:0 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1170.608697] env[68906]: DEBUG nova.virt.hardware [None req-4d72aadc-5a5f-4626-b220-381c1afbdf3f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Image limits 0:0:0 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1170.608880] env[68906]: DEBUG nova.virt.hardware [None req-4d72aadc-5a5f-4626-b220-381c1afbdf3f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Flavor pref 0:0:0 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1170.609041] env[68906]: DEBUG nova.virt.hardware [None req-4d72aadc-5a5f-4626-b220-381c1afbdf3f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Image pref 0:0:0 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1170.609251] env[68906]: DEBUG nova.virt.hardware [None req-4d72aadc-5a5f-4626-b220-381c1afbdf3f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1170.609481] env[68906]:
DEBUG nova.virt.hardware [None req-4d72aadc-5a5f-4626-b220-381c1afbdf3f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68906) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1170.609645] env[68906]: DEBUG nova.virt.hardware [None req-4d72aadc-5a5f-4626-b220-381c1afbdf3f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68906) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1170.609809] env[68906]: DEBUG nova.virt.hardware [None req-4d72aadc-5a5f-4626-b220-381c1afbdf3f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Got 1 possible topologies {{(pid=68906) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1170.609969] env[68906]: DEBUG nova.virt.hardware [None req-4d72aadc-5a5f-4626-b220-381c1afbdf3f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68906) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1170.610175] env[68906]: DEBUG nova.virt.hardware [None req-4d72aadc-5a5f-4626-b220-381c1afbdf3f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68906) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1170.611032] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3d7aa816-35fc-413c-b637-9e07b9cf5614 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1170.619550] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ce46e51f-95a5-430f-83bb-bb3e6fdba539 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1171.107963] env[68906]: DEBUG nova.network.neutron [None req-4d72aadc-5a5f-4626-b220-381c1afbdf3f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] [instance: db011373-7285-4882-8bce-d39cfa22fe80] Successfully created port: 829752c1-acd9-470c-9db8-e6e5b831f39d {{(pid=68906) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1172.157719] env[68906]: DEBUG nova.network.neutron [None req-4d72aadc-5a5f-4626-b220-381c1afbdf3f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] [instance: db011373-7285-4882-8bce-d39cfa22fe80] Successfully updated port: 829752c1-acd9-470c-9db8-e6e5b831f39d {{(pid=68906) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1172.168697] env[68906]: DEBUG oslo_concurrency.lockutils [None req-4d72aadc-5a5f-4626-b220-381c1afbdf3f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Acquiring lock "refresh_cache-db011373-7285-4882-8bce-d39cfa22fe80" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1172.168860] env[68906]: DEBUG oslo_concurrency.lockutils [None 
req-4d72aadc-5a5f-4626-b220-381c1afbdf3f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Acquired lock "refresh_cache-db011373-7285-4882-8bce-d39cfa22fe80" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1172.169032] env[68906]: DEBUG nova.network.neutron [None req-4d72aadc-5a5f-4626-b220-381c1afbdf3f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] [instance: db011373-7285-4882-8bce-d39cfa22fe80] Building network info cache for instance {{(pid=68906) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1172.222584] env[68906]: DEBUG nova.compute.manager [req-ad564909-dde7-4059-968f-f54934d2b951 req-3997cae1-8585-4857-891e-dd9c758e329b service nova] [instance: db011373-7285-4882-8bce-d39cfa22fe80] Received event network-vif-plugged-829752c1-acd9-470c-9db8-e6e5b831f39d {{(pid=68906) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1172.222802] env[68906]: DEBUG oslo_concurrency.lockutils [req-ad564909-dde7-4059-968f-f54934d2b951 req-3997cae1-8585-4857-891e-dd9c758e329b service nova] Acquiring lock "db011373-7285-4882-8bce-d39cfa22fe80-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1172.223037] env[68906]: DEBUG oslo_concurrency.lockutils [req-ad564909-dde7-4059-968f-f54934d2b951 req-3997cae1-8585-4857-891e-dd9c758e329b service nova] Lock "db011373-7285-4882-8bce-d39cfa22fe80-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1172.223229] env[68906]: DEBUG oslo_concurrency.lockutils [req-ad564909-dde7-4059-968f-f54934d2b951 req-3997cae1-8585-4857-891e-dd9c758e329b service nova] Lock "db011373-7285-4882-8bce-d39cfa22fe80-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1172.223401] env[68906]: DEBUG nova.compute.manager [req-ad564909-dde7-4059-968f-f54934d2b951 req-3997cae1-8585-4857-891e-dd9c758e329b service nova] [instance: db011373-7285-4882-8bce-d39cfa22fe80] No waiting events found dispatching network-vif-plugged-829752c1-acd9-470c-9db8-e6e5b831f39d {{(pid=68906) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1172.223608] env[68906]: WARNING nova.compute.manager [req-ad564909-dde7-4059-968f-f54934d2b951 req-3997cae1-8585-4857-891e-dd9c758e329b service nova] [instance: db011373-7285-4882-8bce-d39cfa22fe80] Received unexpected event network-vif-plugged-829752c1-acd9-470c-9db8-e6e5b831f39d for instance with vm_state building and task_state spawning. [ 1172.237997] env[68906]: DEBUG nova.network.neutron [None req-4d72aadc-5a5f-4626-b220-381c1afbdf3f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] [instance: db011373-7285-4882-8bce-d39cfa22fe80] Instance cache missing network info.
{{(pid=68906) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1172.530334] env[68906]: DEBUG nova.network.neutron [None req-4d72aadc-5a5f-4626-b220-381c1afbdf3f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] [instance: db011373-7285-4882-8bce-d39cfa22fe80] Updating instance_info_cache with network_info: [{"id": "829752c1-acd9-470c-9db8-e6e5b831f39d", "address": "fa:16:3e:69:40:2d", "network": {"id": "c9025f67-c9f7-4312-b2bd-5fbb06647b07", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-9371784-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "0e206dedfb584e219a7f5dd633032515", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "f16a5584-aed0-4df4-820b-5e7f15977265", "external-id": "cl2-zone-495", "segmentation_id": 495, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap829752c1-ac", "ovs_interfaceid": "829752c1-acd9-470c-9db8-e6e5b831f39d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68906) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1172.544636] env[68906]: DEBUG oslo_concurrency.lockutils [None req-4d72aadc-5a5f-4626-b220-381c1afbdf3f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Releasing lock "refresh_cache-db011373-7285-4882-8bce-d39cfa22fe80" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1172.544934] env[68906]: DEBUG nova.compute.manager [None req-4d72aadc-5a5f-4626-b220-381c1afbdf3f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] [instance: db011373-7285-4882-8bce-d39cfa22fe80] Instance network_info: |[{"id": "829752c1-acd9-470c-9db8-e6e5b831f39d", "address": "fa:16:3e:69:40:2d", "network": {"id": "c9025f67-c9f7-4312-b2bd-5fbb06647b07", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-9371784-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "0e206dedfb584e219a7f5dd633032515", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "f16a5584-aed0-4df4-820b-5e7f15977265", "external-id": "cl2-zone-495", "segmentation_id": 495, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap829752c1-ac", "ovs_interfaceid": "829752c1-acd9-470c-9db8-e6e5b831f39d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=68906) _allocate_network_async 
/opt/stack/nova/nova/compute/manager.py:1971}} [ 1172.545350] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-4d72aadc-5a5f-4626-b220-381c1afbdf3f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] [instance: db011373-7285-4882-8bce-d39cfa22fe80] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:69:40:2d', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'f16a5584-aed0-4df4-820b-5e7f15977265', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '829752c1-acd9-470c-9db8-e6e5b831f39d', 'vif_model': 'vmxnet3'}] {{(pid=68906) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1172.553344] env[68906]: DEBUG oslo.service.loopingcall [None req-4d72aadc-5a5f-4626-b220-381c1afbdf3f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=68906) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1172.553829] env[68906]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: db011373-7285-4882-8bce-d39cfa22fe80] Creating VM on the ESX host {{(pid=68906) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1172.554075] env[68906]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-a060462d-b67b-4701-a2d6-90843ee80a7e {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1172.574727] env[68906]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1172.574727] env[68906]: value = "task-3475364" [ 1172.574727] env[68906]: _type = "Task" [ 1172.574727] env[68906]: } to complete. {{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1172.581871] env[68906]: DEBUG oslo_vmware.api [-] Task: {'id': task-3475364, 'name': CreateVM_Task} progress is 0%. {{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1173.084739] env[68906]: DEBUG oslo_vmware.api [-] Task: {'id': task-3475364, 'name': CreateVM_Task, 'duration_secs': 0.292199} completed successfully. 
{{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1173.084920] env[68906]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: db011373-7285-4882-8bce-d39cfa22fe80] Created VM on the ESX host {{(pid=68906) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1173.085607] env[68906]: DEBUG oslo_concurrency.lockutils [None req-4d72aadc-5a5f-4626-b220-381c1afbdf3f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1173.085783] env[68906]: DEBUG oslo_concurrency.lockutils [None req-4d72aadc-5a5f-4626-b220-381c1afbdf3f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Acquired lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1173.086134] env[68906]: DEBUG oslo_concurrency.lockutils [None req-4d72aadc-5a5f-4626-b220-381c1afbdf3f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1173.086393] env[68906]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-0e8c1fea-d00a-4745-98c6-1aca8d356bdb {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1173.090762] env[68906]: DEBUG oslo_vmware.api [None req-4d72aadc-5a5f-4626-b220-381c1afbdf3f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Waiting for the task: (returnval){ [ 1173.090762] env[68906]: value = "session[52a3cb0d-1212-490c-2669-91043b4da4d8]521ccd9b-59a2-1f9f-e5aa-b5f2b5d48ae7" [ 1173.090762] env[68906]: _type = "Task" [ 1173.090762] env[68906]: } to complete. {{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1173.098229] env[68906]: DEBUG oslo_vmware.api [None req-4d72aadc-5a5f-4626-b220-381c1afbdf3f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Task: {'id': session[52a3cb0d-1212-490c-2669-91043b4da4d8]521ccd9b-59a2-1f9f-e5aa-b5f2b5d48ae7, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1173.602594] env[68906]: DEBUG oslo_concurrency.lockutils [None req-4d72aadc-5a5f-4626-b220-381c1afbdf3f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Releasing lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1173.602894] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-4d72aadc-5a5f-4626-b220-381c1afbdf3f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] [instance: db011373-7285-4882-8bce-d39cfa22fe80] Processing image b1400c31-d33b-4e13-944f-4c645e62493e {{(pid=68906) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1173.603146] env[68906]: DEBUG oslo_concurrency.lockutils [None req-4d72aadc-5a5f-4626-b220-381c1afbdf3f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e/b1400c31-d33b-4e13-944f-4c645e62493e.vmdk" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1174.141024] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1174.290206] env[68906]: DEBUG nova.compute.manager [req-c586704b-43dd-4948-bf09-3fa2561fa53a req-4938930c-0cf0-440d-a90f-f22e88d27e31 service nova] [instance: db011373-7285-4882-8bce-d39cfa22fe80] Received event network-changed-829752c1-acd9-470c-9db8-e6e5b831f39d {{(pid=68906) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1174.290457] env[68906]: DEBUG nova.compute.manager [req-c586704b-43dd-4948-bf09-3fa2561fa53a req-4938930c-0cf0-440d-a90f-f22e88d27e31 service nova] [instance: db011373-7285-4882-8bce-d39cfa22fe80] Refreshing instance network info cache due to event network-changed-829752c1-acd9-470c-9db8-e6e5b831f39d. 
{{(pid=68906) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1174.290644] env[68906]: DEBUG oslo_concurrency.lockutils [req-c586704b-43dd-4948-bf09-3fa2561fa53a req-4938930c-0cf0-440d-a90f-f22e88d27e31 service nova] Acquiring lock "refresh_cache-db011373-7285-4882-8bce-d39cfa22fe80" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1174.290793] env[68906]: DEBUG oslo_concurrency.lockutils [req-c586704b-43dd-4948-bf09-3fa2561fa53a req-4938930c-0cf0-440d-a90f-f22e88d27e31 service nova] Acquired lock "refresh_cache-db011373-7285-4882-8bce-d39cfa22fe80" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1174.290955] env[68906]: DEBUG nova.network.neutron [req-c586704b-43dd-4948-bf09-3fa2561fa53a req-4938930c-0cf0-440d-a90f-f22e88d27e31 service nova] [instance: db011373-7285-4882-8bce-d39cfa22fe80] Refreshing network info cache for port 829752c1-acd9-470c-9db8-e6e5b831f39d {{(pid=68906) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1174.681751] env[68906]: DEBUG nova.network.neutron [req-c586704b-43dd-4948-bf09-3fa2561fa53a req-4938930c-0cf0-440d-a90f-f22e88d27e31 service nova] [instance: db011373-7285-4882-8bce-d39cfa22fe80] Updated VIF entry in instance network info cache for port 829752c1-acd9-470c-9db8-e6e5b831f39d. {{(pid=68906) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1174.682116] env[68906]: DEBUG nova.network.neutron [req-c586704b-43dd-4948-bf09-3fa2561fa53a req-4938930c-0cf0-440d-a90f-f22e88d27e31 service nova] [instance: db011373-7285-4882-8bce-d39cfa22fe80] Updating instance_info_cache with network_info: [{"id": "829752c1-acd9-470c-9db8-e6e5b831f39d", "address": "fa:16:3e:69:40:2d", "network": {"id": "c9025f67-c9f7-4312-b2bd-5fbb06647b07", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-9371784-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "0e206dedfb584e219a7f5dd633032515", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "f16a5584-aed0-4df4-820b-5e7f15977265", "external-id": "cl2-zone-495", "segmentation_id": 495, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap829752c1-ac", "ovs_interfaceid": "829752c1-acd9-470c-9db8-e6e5b831f39d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68906) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1174.691994] env[68906]: DEBUG oslo_concurrency.lockutils [req-c586704b-43dd-4948-bf09-3fa2561fa53a req-4938930c-0cf0-440d-a90f-f22e88d27e31 service nova] Releasing lock "refresh_cache-db011373-7285-4882-8bce-d39cfa22fe80" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1175.149056] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._cleanup_incomplete_migrations {{(pid=68906) run_periodic_tasks 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1175.149056] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Cleaning up deleted instances with incomplete migration {{(pid=68906) _cleanup_incomplete_migrations /opt/stack/nova/nova/compute/manager.py:11236}} [ 1176.146048] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1177.140076] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1177.140338] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1177.937962] env[68906]: DEBUG oslo_concurrency.lockutils [None req-84b4443f-5b82-4301-af42-70e59908a5fd tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Acquiring lock "db011373-7285-4882-8bce-d39cfa22fe80" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1179.141164] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1179.141476] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._run_pending_deletes {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1179.141476] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Cleaning up deleted instances {{(pid=68906) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11198}} [ 1179.153839] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] There are 0 instances to clean {{(pid=68906) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11207}} [ 1181.148097] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1181.148414] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1181.148516] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] CONF.reclaim_instance_interval <= 0, skipping...
{{(pid=68906) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 1182.140585] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1182.140780] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Starting heal instance info cache {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 1182.140906] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Rebuilding the list of instances to heal {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 1182.162047] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: 13eebe4e-5984-46c3-bb73-cd783ad45df6] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1182.162328] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1182.162454] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1182.162595] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: 91d405f9-5ad1-4e8e-9a32-1a5ef81ecc63] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1182.162720] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: acc11633-a489-4d8f-ad76-f17049a91545] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1182.162842] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: e7286888-d79d-4632-9c06-69c1ef47fa50] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1182.162963] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: 641cca5b-d749-4331-a5e0-8acb6d47cba2] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1182.163191] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: 9bd17d9a-2f34-4e99-91b1-a7c3fbc1f29a] Skipping network cache update for instance because it is Building. 
{{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1182.163255] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: 4d36bb91-0cde-44cb-8706-d17740a9cf50] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1182.163329] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: db011373-7285-4882-8bce-d39cfa22fe80] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1182.163451] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Didn't find any instances for network info cache update. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 1184.140732] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1184.140732] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1184.141148] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager.update_available_resource {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1184.152170] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1184.152393] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1184.152555] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1184.152707] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=68906) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1184.153817] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1e2a1dba-c02a-46e4-8f47-4b2f652a4591 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1184.164025] env[68906]: DEBUG 
oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-db10f2b4-9762-43df-b80e-700554dd2d06 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1184.178383] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5275e1eb-83a1-41fb-9d8c-e78b86339190 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1184.184758] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-44d8efb8-f49b-4e25-9b2d-d6235108326f {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1184.213473] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180974MB free_disk=93GB free_vcpus=48 pci_devices=None {{(pid=68906) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1184.213687] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1184.213924] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1184.361738] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 13eebe4e-5984-46c3-bb73-cd783ad45df6 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1184.361916] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance a7e0a28f-42a5-442e-b962-07771d2e6a27 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1184.362062] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance eb81e9b1-b573-4d7c-9ede-f8b32a43a201 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1184.362219] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 91d405f9-5ad1-4e8e-9a32-1a5ef81ecc63 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1184.362347] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance acc11633-a489-4d8f-ad76-f17049a91545 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1184.362468] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance e7286888-d79d-4632-9c06-69c1ef47fa50 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1184.362585] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 641cca5b-d749-4331-a5e0-8acb6d47cba2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1184.362701] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 9bd17d9a-2f34-4e99-91b1-a7c3fbc1f29a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1184.362814] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 4d36bb91-0cde-44cb-8706-d17740a9cf50 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1184.362924] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance db011373-7285-4882-8bce-d39cfa22fe80 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1184.374951] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 1fdb401a-ac25-4418-803c-fc0b2297f2d4 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1184.389262] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance d6be39b6-8bbc-4657-9ceb-9a4110c29c53 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1184.399041] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 4a616d87-7b55-4b1f-b938-9d9261e8b2cd has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1184.408759] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 38248e62-53b8-402e-aa29-d9a445b0af9d has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1184.419688] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 60ba9060-c3c3-4561-b9e9-e2df08e2e38b has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1184.429532] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance e8a14af6-ab4f-407e-943d-4dd3a46c8711 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1184.439377] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 57078f52-8070-480e-b8ea-278ef759f0a3 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1184.449337] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 6c28f571-e74a-48f9-9cc7-a9e4ddea8b09 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1184.459910] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance e0e595e3-e47e-4cf1-8977-f004eca942d1 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1184.470387] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 3c36e8a4-da45-457e-b4ef-001f4a4e595f has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1184.480473] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 582a086e-5122-41f2-8fb8-513b3734eef4 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1184.493251] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 159edc16-55bb-46eb-8fa9-7da7c1f36cd0 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1184.503713] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 13b471c5-c86e-4b55-a231-159b2219de2f has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1184.516745] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance d01b8b11-bc3b-47dc-8687-a111c1453ed9 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1184.526439] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 7466df8a-59a9-49b9-bff7-c4efbeae3eee has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1184.526782] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=68906) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1184.526989] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=68906) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1184.543384] env[68906]: DEBUG nova.scheduler.client.report [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Refreshing inventories for resource provider 1119f6db-bfd7-4ef3-bdff-5c6974dc249b {{(pid=68906) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:804}} [ 1184.556983] env[68906]: DEBUG nova.scheduler.client.report [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Updating ProviderTree inventory for provider 1119f6db-bfd7-4ef3-bdff-5c6974dc249b from _refresh_and_get_inventory using data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 93, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68906) _refresh_and_get_inventory /opt/stack/nova/nova/scheduler/client/report.py:768}} [ 1184.557196] env[68906]: DEBUG nova.compute.provider_tree [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Updating inventory in ProviderTree for provider 1119f6db-bfd7-4ef3-bdff-5c6974dc249b with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 93, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68906) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 1184.571258] env[68906]: DEBUG nova.scheduler.client.report [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Refreshing aggregate associations for resource provider 1119f6db-bfd7-4ef3-bdff-5c6974dc249b, aggregates: None {{(pid=68906) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:813}} [ 1184.592728] env[68906]: DEBUG nova.scheduler.client.report [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Refreshing trait associations for resource provider 1119f6db-bfd7-4ef3-bdff-5c6974dc249b, traits: COMPUTE_SAME_HOST_COLD_MIGRATE,COMPUTE_IMAGE_TYPE_VMDK,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NODE,COMPUTE_IMAGE_TYPE_ISO {{(pid=68906) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:825}} [ 1184.741016] env[68906]: DEBUG oslo_concurrency.lockutils [None req-405761bb-a2e6-43b7-ab88-ccfe847f6af7 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Acquiring lock "89171680-c76d-4826-9236-379542661ffb" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=68906) inner
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1184.741265] env[68906]: DEBUG oslo_concurrency.lockutils [None req-405761bb-a2e6-43b7-ab88-ccfe847f6af7 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Lock "89171680-c76d-4826-9236-379542661ffb" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1184.851285] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b700c02f-53f1-4bb9-a446-b01c626b91a3 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1184.859016] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e8523569-925c-4569-8147-057ecaf20ba8 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1184.889822] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e85de67f-c892-4b35-acb3-59a89d122c92 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1184.896386] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5e60c768-7b09-4e9a-b6f1-c4eecb94da95 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1184.909930] env[68906]: DEBUG nova.compute.provider_tree [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Inventory has not changed in ProviderTree for provider: 1119f6db-bfd7-4ef3-bdff-5c6974dc249b {{(pid=68906) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1184.917932] env[68906]: DEBUG nova.scheduler.client.report [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Inventory has not changed for provider 1119f6db-bfd7-4ef3-bdff-5c6974dc249b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 93, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68906) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1184.930824] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=68906) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1184.931010] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.717s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1210.903752] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._sync_power_states {{(pid=68906) run_periodic_tasks 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1210.930456] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Getting list of instances from cluster (obj){ [ 1210.930456] env[68906]: value = "domain-c8" [ 1210.930456] env[68906]: _type = "ClusterComputeResource" [ 1210.930456] env[68906]: } {{(pid=68906) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2122}} [ 1210.931728] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-77607488-886a-472c-ba52-d808ac364470 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1210.949233] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Got total of 10 instances {{(pid=68906) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2131}} [ 1210.949382] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Triggering sync for uuid 13eebe4e-5984-46c3-bb73-cd783ad45df6 {{(pid=68906) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 1210.949578] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Triggering sync for uuid a7e0a28f-42a5-442e-b962-07771d2e6a27 {{(pid=68906) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 1210.949738] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Triggering sync for uuid eb81e9b1-b573-4d7c-9ede-f8b32a43a201 {{(pid=68906) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 1210.949895] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Triggering sync for uuid 91d405f9-5ad1-4e8e-9a32-1a5ef81ecc63 {{(pid=68906) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 1210.950057] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Triggering sync for uuid acc11633-a489-4d8f-ad76-f17049a91545 {{(pid=68906) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 1210.950213] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Triggering sync for uuid e7286888-d79d-4632-9c06-69c1ef47fa50 {{(pid=68906) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 1210.950363] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Triggering sync for uuid 641cca5b-d749-4331-a5e0-8acb6d47cba2 {{(pid=68906) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 1210.950590] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Triggering sync for uuid 9bd17d9a-2f34-4e99-91b1-a7c3fbc1f29a {{(pid=68906) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 1210.950747] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Triggering sync for uuid 4d36bb91-0cde-44cb-8706-d17740a9cf50 {{(pid=68906) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 1210.950896] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Triggering sync for uuid db011373-7285-4882-8bce-d39cfa22fe80 {{(pid=68906) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 1210.951266] env[68906]: DEBUG 
oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Acquiring lock "13eebe4e-5984-46c3-bb73-cd783ad45df6" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1210.951509] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Acquiring lock "a7e0a28f-42a5-442e-b962-07771d2e6a27" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1210.951712] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Acquiring lock "eb81e9b1-b573-4d7c-9ede-f8b32a43a201" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1210.951911] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Acquiring lock "91d405f9-5ad1-4e8e-9a32-1a5ef81ecc63" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1210.952121] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Acquiring lock "acc11633-a489-4d8f-ad76-f17049a91545" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1210.952317] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Acquiring lock "e7286888-d79d-4632-9c06-69c1ef47fa50" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1210.952508] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Acquiring lock "641cca5b-d749-4331-a5e0-8acb6d47cba2" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1210.952696] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Acquiring lock "9bd17d9a-2f34-4e99-91b1-a7c3fbc1f29a" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1210.952895] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Acquiring lock "4d36bb91-0cde-44cb-8706-d17740a9cf50" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1210.953097] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Acquiring lock 
"db011373-7285-4882-8bce-d39cfa22fe80" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1215.231032] env[68906]: WARNING oslo_vmware.rw_handles [None req-f739b187-6eca-4474-a1be-8f642c15af34 tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1215.231032] env[68906]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1215.231032] env[68906]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1215.231032] env[68906]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1215.231032] env[68906]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1215.231032] env[68906]: ERROR oslo_vmware.rw_handles response.begin() [ 1215.231032] env[68906]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1215.231032] env[68906]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1215.231032] env[68906]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1215.231032] env[68906]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1215.231032] env[68906]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1215.231032] env[68906]: ERROR oslo_vmware.rw_handles [ 1215.231032] env[68906]: DEBUG nova.virt.vmwareapi.images [None req-f739b187-6eca-4474-a1be-8f642c15af34 tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] [instance: 13eebe4e-5984-46c3-bb73-cd783ad45df6] Downloaded image file data b1400c31-d33b-4e13-944f-4c645e62493e to vmware_temp/84c94c38-c8f4-422a-b6da-87da5da69cf5/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk on the data store datastore2 {{(pid=68906) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1215.233587] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-f739b187-6eca-4474-a1be-8f642c15af34 tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] [instance: 13eebe4e-5984-46c3-bb73-cd783ad45df6] Caching image {{(pid=68906) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1215.233846] env[68906]: DEBUG nova.virt.vmwareapi.vm_util [None req-f739b187-6eca-4474-a1be-8f642c15af34 tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] Copying Virtual Disk [datastore2] vmware_temp/84c94c38-c8f4-422a-b6da-87da5da69cf5/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk to [datastore2] vmware_temp/84c94c38-c8f4-422a-b6da-87da5da69cf5/b1400c31-d33b-4e13-944f-4c645e62493e/b1400c31-d33b-4e13-944f-4c645e62493e.vmdk {{(pid=68906) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1215.234149] env[68906]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-ce569067-9c9c-445b-861e-48269a67b639 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1215.242112] env[68906]: DEBUG oslo_vmware.api [None 
req-f739b187-6eca-4474-a1be-8f642c15af34 tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] Waiting for the task: (returnval){ [ 1215.242112] env[68906]: value = "task-3475365" [ 1215.242112] env[68906]: _type = "Task" [ 1215.242112] env[68906]: } to complete. {{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1215.249817] env[68906]: DEBUG oslo_vmware.api [None req-f739b187-6eca-4474-a1be-8f642c15af34 tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] Task: {'id': task-3475365, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1215.752254] env[68906]: DEBUG oslo_vmware.exceptions [None req-f739b187-6eca-4474-a1be-8f642c15af34 tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] Fault InvalidArgument not matched. {{(pid=68906) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1215.752545] env[68906]: DEBUG oslo_concurrency.lockutils [None req-f739b187-6eca-4474-a1be-8f642c15af34 tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] Releasing lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e/b1400c31-d33b-4e13-944f-4c645e62493e.vmdk" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1215.753139] env[68906]: ERROR nova.compute.manager [None req-f739b187-6eca-4474-a1be-8f642c15af34 tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] [instance: 13eebe4e-5984-46c3-bb73-cd783ad45df6] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1215.753139] env[68906]: Faults: ['InvalidArgument'] [ 1215.753139] env[68906]: ERROR nova.compute.manager [instance: 13eebe4e-5984-46c3-bb73-cd783ad45df6] Traceback (most recent call last): [ 1215.753139] env[68906]: ERROR nova.compute.manager [instance: 13eebe4e-5984-46c3-bb73-cd783ad45df6] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1215.753139] env[68906]: ERROR nova.compute.manager [instance: 13eebe4e-5984-46c3-bb73-cd783ad45df6] yield resources [ 1215.753139] env[68906]: ERROR nova.compute.manager [instance: 13eebe4e-5984-46c3-bb73-cd783ad45df6] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1215.753139] env[68906]: ERROR nova.compute.manager [instance: 13eebe4e-5984-46c3-bb73-cd783ad45df6] self.driver.spawn(context, instance, image_meta, [ 1215.753139] env[68906]: ERROR nova.compute.manager [instance: 13eebe4e-5984-46c3-bb73-cd783ad45df6] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1215.753139] env[68906]: ERROR nova.compute.manager [instance: 13eebe4e-5984-46c3-bb73-cd783ad45df6] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1215.753139] env[68906]: ERROR nova.compute.manager [instance: 13eebe4e-5984-46c3-bb73-cd783ad45df6] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1215.753139] env[68906]: ERROR nova.compute.manager [instance: 13eebe4e-5984-46c3-bb73-cd783ad45df6] self._fetch_image_if_missing(context, vi) [ 1215.753139] env[68906]: ERROR nova.compute.manager [instance: 13eebe4e-5984-46c3-bb73-cd783ad45df6] File 
"/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1215.753524] env[68906]: ERROR nova.compute.manager [instance: 13eebe4e-5984-46c3-bb73-cd783ad45df6] image_cache(vi, tmp_image_ds_loc) [ 1215.753524] env[68906]: ERROR nova.compute.manager [instance: 13eebe4e-5984-46c3-bb73-cd783ad45df6] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1215.753524] env[68906]: ERROR nova.compute.manager [instance: 13eebe4e-5984-46c3-bb73-cd783ad45df6] vm_util.copy_virtual_disk( [ 1215.753524] env[68906]: ERROR nova.compute.manager [instance: 13eebe4e-5984-46c3-bb73-cd783ad45df6] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1215.753524] env[68906]: ERROR nova.compute.manager [instance: 13eebe4e-5984-46c3-bb73-cd783ad45df6] session._wait_for_task(vmdk_copy_task) [ 1215.753524] env[68906]: ERROR nova.compute.manager [instance: 13eebe4e-5984-46c3-bb73-cd783ad45df6] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1215.753524] env[68906]: ERROR nova.compute.manager [instance: 13eebe4e-5984-46c3-bb73-cd783ad45df6] return self.wait_for_task(task_ref) [ 1215.753524] env[68906]: ERROR nova.compute.manager [instance: 13eebe4e-5984-46c3-bb73-cd783ad45df6] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1215.753524] env[68906]: ERROR nova.compute.manager [instance: 13eebe4e-5984-46c3-bb73-cd783ad45df6] return evt.wait() [ 1215.753524] env[68906]: ERROR nova.compute.manager [instance: 13eebe4e-5984-46c3-bb73-cd783ad45df6] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1215.753524] env[68906]: ERROR nova.compute.manager [instance: 13eebe4e-5984-46c3-bb73-cd783ad45df6] result = hub.switch() [ 1215.753524] env[68906]: ERROR nova.compute.manager [instance: 13eebe4e-5984-46c3-bb73-cd783ad45df6] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1215.753524] env[68906]: ERROR nova.compute.manager [instance: 13eebe4e-5984-46c3-bb73-cd783ad45df6] return self.greenlet.switch() [ 1215.753919] env[68906]: ERROR nova.compute.manager [instance: 13eebe4e-5984-46c3-bb73-cd783ad45df6] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1215.753919] env[68906]: ERROR nova.compute.manager [instance: 13eebe4e-5984-46c3-bb73-cd783ad45df6] self.f(*self.args, **self.kw) [ 1215.753919] env[68906]: ERROR nova.compute.manager [instance: 13eebe4e-5984-46c3-bb73-cd783ad45df6] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1215.753919] env[68906]: ERROR nova.compute.manager [instance: 13eebe4e-5984-46c3-bb73-cd783ad45df6] raise exceptions.translate_fault(task_info.error) [ 1215.753919] env[68906]: ERROR nova.compute.manager [instance: 13eebe4e-5984-46c3-bb73-cd783ad45df6] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1215.753919] env[68906]: ERROR nova.compute.manager [instance: 13eebe4e-5984-46c3-bb73-cd783ad45df6] Faults: ['InvalidArgument'] [ 1215.753919] env[68906]: ERROR nova.compute.manager [instance: 13eebe4e-5984-46c3-bb73-cd783ad45df6] [ 1215.753919] env[68906]: INFO nova.compute.manager [None req-f739b187-6eca-4474-a1be-8f642c15af34 tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] [instance: 
13eebe4e-5984-46c3-bb73-cd783ad45df6] Terminating instance [ 1215.755014] env[68906]: DEBUG oslo_concurrency.lockutils [None req-6d6efef2-b262-4393-93df-3cbea2feabcc tempest-TenantUsagesTestJSON-2001711929 tempest-TenantUsagesTestJSON-2001711929-project-member] Acquired lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e/b1400c31-d33b-4e13-944f-4c645e62493e.vmdk" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1215.755280] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-6d6efef2-b262-4393-93df-3cbea2feabcc tempest-TenantUsagesTestJSON-2001711929 tempest-TenantUsagesTestJSON-2001711929-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=68906) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1215.755524] env[68906]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-0c6ed95b-7397-434d-a07c-ed517474c1fb {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1215.758010] env[68906]: DEBUG nova.compute.manager [None req-f739b187-6eca-4474-a1be-8f642c15af34 tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] [instance: 13eebe4e-5984-46c3-bb73-cd783ad45df6] Start destroying the instance on the hypervisor. {{(pid=68906) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1215.758264] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-f739b187-6eca-4474-a1be-8f642c15af34 tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] [instance: 13eebe4e-5984-46c3-bb73-cd783ad45df6] Destroying instance {{(pid=68906) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1215.758993] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7640b17d-d923-420c-b129-a011a98123de {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1215.766369] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-f739b187-6eca-4474-a1be-8f642c15af34 tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] [instance: 13eebe4e-5984-46c3-bb73-cd783ad45df6] Unregistering the VM {{(pid=68906) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1215.766685] env[68906]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-c38917cd-8ce2-434c-8d3e-b3775678455b {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1215.768938] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-6d6efef2-b262-4393-93df-3cbea2feabcc tempest-TenantUsagesTestJSON-2001711929 tempest-TenantUsagesTestJSON-2001711929-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=68906) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1215.769147] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-6d6efef2-b262-4393-93df-3cbea2feabcc tempest-TenantUsagesTestJSON-2001711929 tempest-TenantUsagesTestJSON-2001711929-project-member] Folder [datastore2] devstack-image-cache_base created. 
{{(pid=68906) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1215.770168] env[68906]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-3716a13b-87f5-4b87-9eb2-9780bbcdc4ff {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1215.774829] env[68906]: DEBUG oslo_vmware.api [None req-6d6efef2-b262-4393-93df-3cbea2feabcc tempest-TenantUsagesTestJSON-2001711929 tempest-TenantUsagesTestJSON-2001711929-project-member] Waiting for the task: (returnval){ [ 1215.774829] env[68906]: value = "session[52a3cb0d-1212-490c-2669-91043b4da4d8]52848fc8-3071-3069-f6fd-d8803dd7d7cd" [ 1215.774829] env[68906]: _type = "Task" [ 1215.774829] env[68906]: } to complete. {{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1215.781926] env[68906]: DEBUG oslo_vmware.api [None req-6d6efef2-b262-4393-93df-3cbea2feabcc tempest-TenantUsagesTestJSON-2001711929 tempest-TenantUsagesTestJSON-2001711929-project-member] Task: {'id': session[52a3cb0d-1212-490c-2669-91043b4da4d8]52848fc8-3071-3069-f6fd-d8803dd7d7cd, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1215.834910] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-f739b187-6eca-4474-a1be-8f642c15af34 tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] [instance: 13eebe4e-5984-46c3-bb73-cd783ad45df6] Unregistered the VM {{(pid=68906) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1215.835254] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-f739b187-6eca-4474-a1be-8f642c15af34 tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] [instance: 13eebe4e-5984-46c3-bb73-cd783ad45df6] Deleting contents of the VM from datastore datastore2 {{(pid=68906) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1215.835496] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-f739b187-6eca-4474-a1be-8f642c15af34 tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] Deleting the datastore file [datastore2] 13eebe4e-5984-46c3-bb73-cd783ad45df6 {{(pid=68906) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1215.835806] env[68906]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-1fd831d4-ffb3-4891-9573-f8c400b9577c {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1215.842215] env[68906]: DEBUG oslo_vmware.api [None req-f739b187-6eca-4474-a1be-8f642c15af34 tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] Waiting for the task: (returnval){ [ 1215.842215] env[68906]: value = "task-3475367" [ 1215.842215] env[68906]: _type = "Task" [ 1215.842215] env[68906]: } to complete. {{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1215.849723] env[68906]: DEBUG oslo_vmware.api [None req-f739b187-6eca-4474-a1be-8f642c15af34 tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] Task: {'id': task-3475367, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1216.284540] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-6d6efef2-b262-4393-93df-3cbea2feabcc tempest-TenantUsagesTestJSON-2001711929 tempest-TenantUsagesTestJSON-2001711929-project-member] [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] Preparing fetch location {{(pid=68906) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1216.284826] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-6d6efef2-b262-4393-93df-3cbea2feabcc tempest-TenantUsagesTestJSON-2001711929 tempest-TenantUsagesTestJSON-2001711929-project-member] Creating directory with path [datastore2] vmware_temp/0adcb25d-1777-4154-9875-507d27045746/b1400c31-d33b-4e13-944f-4c645e62493e {{(pid=68906) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1216.285035] env[68906]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-41e95baa-3dd5-417d-81e0-eb10133b603c {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1216.296474] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-6d6efef2-b262-4393-93df-3cbea2feabcc tempest-TenantUsagesTestJSON-2001711929 tempest-TenantUsagesTestJSON-2001711929-project-member] Created directory with path [datastore2] vmware_temp/0adcb25d-1777-4154-9875-507d27045746/b1400c31-d33b-4e13-944f-4c645e62493e {{(pid=68906) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1216.296783] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-6d6efef2-b262-4393-93df-3cbea2feabcc tempest-TenantUsagesTestJSON-2001711929 tempest-TenantUsagesTestJSON-2001711929-project-member] [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] Fetch image to [datastore2] vmware_temp/0adcb25d-1777-4154-9875-507d27045746/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk {{(pid=68906) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1216.296995] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-6d6efef2-b262-4393-93df-3cbea2feabcc tempest-TenantUsagesTestJSON-2001711929 tempest-TenantUsagesTestJSON-2001711929-project-member] [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] Downloading image file data b1400c31-d33b-4e13-944f-4c645e62493e to [datastore2] vmware_temp/0adcb25d-1777-4154-9875-507d27045746/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk on the data store datastore2 {{(pid=68906) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1216.297830] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ead972b3-c4e4-4984-aaaf-fd762cd19345 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1216.304433] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-feef6f4a-c60c-48cb-b462-d2162503fe41 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1216.314086] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4e43d61d-b4d0-44d7-b498-f98d0e74f6e1 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1216.348285] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2e01017a-70ab-4f5a-bbc1-8c530de686ce {{(pid=68906) 
request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1216.354900] env[68906]: DEBUG oslo_vmware.api [None req-f739b187-6eca-4474-a1be-8f642c15af34 tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] Task: {'id': task-3475367, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.064238} completed successfully. {{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1216.356494] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-f739b187-6eca-4474-a1be-8f642c15af34 tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] Deleted the datastore file {{(pid=68906) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1216.356729] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-f739b187-6eca-4474-a1be-8f642c15af34 tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] [instance: 13eebe4e-5984-46c3-bb73-cd783ad45df6] Deleted contents of the VM from datastore datastore2 {{(pid=68906) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1216.356911] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-f739b187-6eca-4474-a1be-8f642c15af34 tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] [instance: 13eebe4e-5984-46c3-bb73-cd783ad45df6] Instance destroyed {{(pid=68906) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1216.357101] env[68906]: INFO nova.compute.manager [None req-f739b187-6eca-4474-a1be-8f642c15af34 tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] [instance: 13eebe4e-5984-46c3-bb73-cd783ad45df6] Took 0.60 seconds to destroy the instance on the hypervisor. 
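
The `Task: {'id': task-3475367, 'name': DeleteDatastoreFile_Task} progress is 0%` line followed by `completed successfully` above comes out of oslo.vmware's task polling: `wait_for_task` keeps re-reading the vCenter task state until it reaches a terminal state, and an error state is translated into the `VimFaultException` seen in the earlier traceback. A minimal sketch of that poll-until-terminal pattern (`get_task_info` and `TaskFaultError` are illustrative stand-ins, not oslo.vmware's real API):

```python
import time

class TaskFaultError(Exception):
    """Illustrative stand-in for oslo_vmware.exceptions.VimFaultException."""

def wait_for_task(get_task_info, interval=0.5):
    # Poll the task until vCenter reports a terminal state, mirroring the
    # "progress is 0%" DEBUG lines followed by either "completed successfully"
    # or a raised fault, as seen in the log above.
    while True:
        info = get_task_info()  # e.g. {'state': 'running', 'progress': 0}
        if info['state'] == 'success':
            return info
        if info['state'] == 'error':
            raise TaskFaultError(info.get('error', 'unknown fault'))
        time.sleep(interval)
```

The real loop runs on an eventlet-based looping call rather than `time.sleep`, which is why the earlier traceback passes through `loopingcall.py`, `evt.wait()` and `hub.switch()` before `_poll_task` raises the translated fault.
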
[ 1216.359355] env[68906]: DEBUG nova.compute.claims [None req-f739b187-6eca-4474-a1be-8f642c15af34 tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] [instance: 13eebe4e-5984-46c3-bb73-cd783ad45df6] Aborting claim: {{(pid=68906) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1216.359522] env[68906]: DEBUG oslo_concurrency.lockutils [None req-f739b187-6eca-4474-a1be-8f642c15af34 tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1216.359747] env[68906]: DEBUG oslo_concurrency.lockutils [None req-f739b187-6eca-4474-a1be-8f642c15af34 tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1216.362211] env[68906]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-b521ba50-4ded-4c1a-aa67-cffbd82da43a {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1216.386083] env[68906]: DEBUG nova.virt.vmwareapi.images [None req-6d6efef2-b262-4393-93df-3cbea2feabcc tempest-TenantUsagesTestJSON-2001711929 tempest-TenantUsagesTestJSON-2001711929-project-member] [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] Downloading image file data b1400c31-d33b-4e13-944f-4c645e62493e to the data store datastore2 {{(pid=68906) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1216.438596] env[68906]: DEBUG oslo_vmware.rw_handles [None req-6d6efef2-b262-4393-93df-3cbea2feabcc tempest-TenantUsagesTestJSON-2001711929 tempest-TenantUsagesTestJSON-2001711929-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/0adcb25d-1777-4154-9875-507d27045746/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=68906) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1216.498727] env[68906]: DEBUG oslo_vmware.rw_handles [None req-6d6efef2-b262-4393-93df-3cbea2feabcc tempest-TenantUsagesTestJSON-2001711929 tempest-TenantUsagesTestJSON-2001711929-project-member] Completed reading data from the image iterator. {{(pid=68906) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1216.498922] env[68906]: DEBUG oslo_vmware.rw_handles [None req-6d6efef2-b262-4393-93df-3cbea2feabcc tempest-TenantUsagesTestJSON-2001711929 tempest-TenantUsagesTestJSON-2001711929-project-member] Closing write handle for https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/0adcb25d-1777-4154-9875-507d27045746/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=68906) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1216.737056] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-859e1de6-6f8c-499c-8e02-3353dd6c0eb3 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1216.743980] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6ef93b0e-e3b8-4d1a-b786-1fb61ef1057a {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1216.773771] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-74be77be-0d3b-466b-b922-6cf8a39b0296 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1216.780294] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3bf6fe0d-eeec-4db3-b30d-cf4fad48db5b {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1216.793527] env[68906]: DEBUG nova.compute.provider_tree [None req-f739b187-6eca-4474-a1be-8f642c15af34 tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] Inventory has not changed in ProviderTree for provider: 1119f6db-bfd7-4ef3-bdff-5c6974dc249b {{(pid=68906) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1216.801883] env[68906]: DEBUG nova.scheduler.client.report [None req-f739b187-6eca-4474-a1be-8f642c15af34 tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] Inventory has not changed for provider 1119f6db-bfd7-4ef3-bdff-5c6974dc249b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 93, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68906) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1216.815564] env[68906]: DEBUG oslo_concurrency.lockutils [None req-f739b187-6eca-4474-a1be-8f642c15af34 tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.456s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1216.816146] env[68906]: ERROR nova.compute.manager [None req-f739b187-6eca-4474-a1be-8f642c15af34 tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] [instance: 13eebe4e-5984-46c3-bb73-cd783ad45df6] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1216.816146] env[68906]: Faults: ['InvalidArgument'] [ 1216.816146] env[68906]: ERROR nova.compute.manager [instance: 13eebe4e-5984-46c3-bb73-cd783ad45df6] Traceback (most recent call last): [ 1216.816146] env[68906]: ERROR nova.compute.manager [instance: 13eebe4e-5984-46c3-bb73-cd783ad45df6] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1216.816146] env[68906]: ERROR 
nova.compute.manager [instance: 13eebe4e-5984-46c3-bb73-cd783ad45df6] self.driver.spawn(context, instance, image_meta, [ 1216.816146] env[68906]: ERROR nova.compute.manager [instance: 13eebe4e-5984-46c3-bb73-cd783ad45df6] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1216.816146] env[68906]: ERROR nova.compute.manager [instance: 13eebe4e-5984-46c3-bb73-cd783ad45df6] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1216.816146] env[68906]: ERROR nova.compute.manager [instance: 13eebe4e-5984-46c3-bb73-cd783ad45df6] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1216.816146] env[68906]: ERROR nova.compute.manager [instance: 13eebe4e-5984-46c3-bb73-cd783ad45df6] self._fetch_image_if_missing(context, vi) [ 1216.816146] env[68906]: ERROR nova.compute.manager [instance: 13eebe4e-5984-46c3-bb73-cd783ad45df6] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1216.816146] env[68906]: ERROR nova.compute.manager [instance: 13eebe4e-5984-46c3-bb73-cd783ad45df6] image_cache(vi, tmp_image_ds_loc) [ 1216.816146] env[68906]: ERROR nova.compute.manager [instance: 13eebe4e-5984-46c3-bb73-cd783ad45df6] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1216.816580] env[68906]: ERROR nova.compute.manager [instance: 13eebe4e-5984-46c3-bb73-cd783ad45df6] vm_util.copy_virtual_disk( [ 1216.816580] env[68906]: ERROR nova.compute.manager [instance: 13eebe4e-5984-46c3-bb73-cd783ad45df6] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1216.816580] env[68906]: ERROR nova.compute.manager [instance: 13eebe4e-5984-46c3-bb73-cd783ad45df6] session._wait_for_task(vmdk_copy_task) [ 1216.816580] env[68906]: ERROR nova.compute.manager [instance: 13eebe4e-5984-46c3-bb73-cd783ad45df6] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1216.816580] env[68906]: ERROR nova.compute.manager [instance: 13eebe4e-5984-46c3-bb73-cd783ad45df6] return self.wait_for_task(task_ref) [ 1216.816580] env[68906]: ERROR nova.compute.manager [instance: 13eebe4e-5984-46c3-bb73-cd783ad45df6] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1216.816580] env[68906]: ERROR nova.compute.manager [instance: 13eebe4e-5984-46c3-bb73-cd783ad45df6] return evt.wait() [ 1216.816580] env[68906]: ERROR nova.compute.manager [instance: 13eebe4e-5984-46c3-bb73-cd783ad45df6] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1216.816580] env[68906]: ERROR nova.compute.manager [instance: 13eebe4e-5984-46c3-bb73-cd783ad45df6] result = hub.switch() [ 1216.816580] env[68906]: ERROR nova.compute.manager [instance: 13eebe4e-5984-46c3-bb73-cd783ad45df6] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1216.816580] env[68906]: ERROR nova.compute.manager [instance: 13eebe4e-5984-46c3-bb73-cd783ad45df6] return self.greenlet.switch() [ 1216.816580] env[68906]: ERROR nova.compute.manager [instance: 13eebe4e-5984-46c3-bb73-cd783ad45df6] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1216.816580] env[68906]: ERROR nova.compute.manager [instance: 13eebe4e-5984-46c3-bb73-cd783ad45df6] self.f(*self.args, **self.kw) [ 1216.816945] env[68906]: ERROR nova.compute.manager [instance: 13eebe4e-5984-46c3-bb73-cd783ad45df6] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1216.816945] env[68906]: ERROR nova.compute.manager [instance: 13eebe4e-5984-46c3-bb73-cd783ad45df6] raise exceptions.translate_fault(task_info.error) [ 1216.816945] env[68906]: ERROR nova.compute.manager [instance: 13eebe4e-5984-46c3-bb73-cd783ad45df6] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1216.816945] env[68906]: ERROR nova.compute.manager [instance: 13eebe4e-5984-46c3-bb73-cd783ad45df6] Faults: ['InvalidArgument'] [ 1216.816945] env[68906]: ERROR nova.compute.manager [instance: 13eebe4e-5984-46c3-bb73-cd783ad45df6] [ 1216.816945] env[68906]: DEBUG nova.compute.utils [None req-f739b187-6eca-4474-a1be-8f642c15af34 tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] [instance: 13eebe4e-5984-46c3-bb73-cd783ad45df6] VimFaultException {{(pid=68906) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1216.818559] env[68906]: DEBUG nova.compute.manager [None req-f739b187-6eca-4474-a1be-8f642c15af34 tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] [instance: 13eebe4e-5984-46c3-bb73-cd783ad45df6] Build of instance 13eebe4e-5984-46c3-bb73-cd783ad45df6 was re-scheduled: A specified parameter was not correct: fileType [ 1216.818559] env[68906]: Faults: ['InvalidArgument'] {{(pid=68906) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1216.818928] env[68906]: DEBUG nova.compute.manager [None req-f739b187-6eca-4474-a1be-8f642c15af34 tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] [instance: 13eebe4e-5984-46c3-bb73-cd783ad45df6] Unplugging VIFs for instance {{(pid=68906) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1216.819123] env[68906]: DEBUG nova.compute.manager [None req-f739b187-6eca-4474-a1be-8f642c15af34 tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=68906) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1216.819310] env[68906]: DEBUG nova.compute.manager [None req-f739b187-6eca-4474-a1be-8f642c15af34 tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] [instance: 13eebe4e-5984-46c3-bb73-cd783ad45df6] Deallocating network for instance {{(pid=68906) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1216.819476] env[68906]: DEBUG nova.network.neutron [None req-f739b187-6eca-4474-a1be-8f642c15af34 tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] [instance: 13eebe4e-5984-46c3-bb73-cd783ad45df6] deallocate_for_instance() {{(pid=68906) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1217.221708] env[68906]: DEBUG nova.network.neutron [None req-f739b187-6eca-4474-a1be-8f642c15af34 tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] [instance: 13eebe4e-5984-46c3-bb73-cd783ad45df6] Updating instance_info_cache with network_info: [] {{(pid=68906) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1217.234291] env[68906]: INFO nova.compute.manager [None req-f739b187-6eca-4474-a1be-8f642c15af34 tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] [instance: 13eebe4e-5984-46c3-bb73-cd783ad45df6] Took 0.41 seconds to deallocate network for instance. [ 1217.342360] env[68906]: INFO nova.scheduler.client.report [None req-f739b187-6eca-4474-a1be-8f642c15af34 tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] Deleted allocations for instance 13eebe4e-5984-46c3-bb73-cd783ad45df6 [ 1217.366905] env[68906]: DEBUG oslo_concurrency.lockutils [None req-f739b187-6eca-4474-a1be-8f642c15af34 tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] Lock "13eebe4e-5984-46c3-bb73-cd783ad45df6" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 569.304s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1217.368330] env[68906]: DEBUG oslo_concurrency.lockutils [None req-0c4881c2-db84-41d7-ad4c-edd06f5d686e tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] Lock "13eebe4e-5984-46c3-bb73-cd783ad45df6" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 371.009s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1217.370587] env[68906]: DEBUG oslo_concurrency.lockutils [None req-0c4881c2-db84-41d7-ad4c-edd06f5d686e tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] Acquiring lock "13eebe4e-5984-46c3-bb73-cd783ad45df6-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1217.370918] env[68906]: DEBUG oslo_concurrency.lockutils [None req-0c4881c2-db84-41d7-ad4c-edd06f5d686e tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] Lock "13eebe4e-5984-46c3-bb73-cd783ad45df6-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=68906) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1217.371273] env[68906]: DEBUG oslo_concurrency.lockutils [None req-0c4881c2-db84-41d7-ad4c-edd06f5d686e tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] Lock "13eebe4e-5984-46c3-bb73-cd783ad45df6-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1217.373416] env[68906]: INFO nova.compute.manager [None req-0c4881c2-db84-41d7-ad4c-edd06f5d686e tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] [instance: 13eebe4e-5984-46c3-bb73-cd783ad45df6] Terminating instance [ 1217.375194] env[68906]: DEBUG nova.compute.manager [None req-0c4881c2-db84-41d7-ad4c-edd06f5d686e tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] [instance: 13eebe4e-5984-46c3-bb73-cd783ad45df6] Start destroying the instance on the hypervisor. {{(pid=68906) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1217.375458] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-0c4881c2-db84-41d7-ad4c-edd06f5d686e tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] [instance: 13eebe4e-5984-46c3-bb73-cd783ad45df6] Destroying instance {{(pid=68906) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1217.376051] env[68906]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-05c7d2a3-6639-42ab-8db9-e20bdfc38ccb {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1217.386421] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-993b60bf-e507-463c-9127-2e3cc9f45a8a {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1217.398292] env[68906]: DEBUG nova.compute.manager [None req-67eb7be2-f488-498f-bd04-d2fc16581526 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: 19d8683f-32f8-48b1-960a-b91b5f82a815] Starting instance... {{(pid=68906) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1217.419290] env[68906]: WARNING nova.virt.vmwareapi.vmops [None req-0c4881c2-db84-41d7-ad4c-edd06f5d686e tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] [instance: 13eebe4e-5984-46c3-bb73-cd783ad45df6] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 13eebe4e-5984-46c3-bb73-cd783ad45df6 could not be found. [ 1217.419642] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-0c4881c2-db84-41d7-ad4c-edd06f5d686e tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] [instance: 13eebe4e-5984-46c3-bb73-cd783ad45df6] Instance destroyed {{(pid=68906) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1217.419742] env[68906]: INFO nova.compute.manager [None req-0c4881c2-db84-41d7-ad4c-edd06f5d686e tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] [instance: 13eebe4e-5984-46c3-bb73-cd783ad45df6] Took 0.04 seconds to destroy the instance on the hypervisor. 
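
The WARNING above ("Instance does not exist on backend: nova.exception.InstanceNotFound") followed immediately by "Instance destroyed" and a 0.04-second destroy shows the destroy path being idempotent: a VM already gone from vCenter is logged and treated as successfully destroyed so terminate can proceed. A rough sketch of that pattern, with `backend`, `lookup`, `unregister` and `delete_datastore_files` as hypothetical stand-ins for the driver internals:

```python
class InstanceNotFound(Exception):
    """Illustrative stand-in for nova.exception.InstanceNotFound."""

def destroy(backend, instance_uuid, log):
    # Idempotent destroy: a VM that is already absent on the hypervisor is
    # logged as a WARNING and treated as destroyed, matching the
    # "Instance does not exist on backend" / "Instance destroyed" lines.
    try:
        vm_ref = backend.lookup(instance_uuid)
    except InstanceNotFound as exc:
        log.warning("Instance does not exist on backend: %s", exc)
        return
    backend.unregister(vm_ref)
    backend.delete_datastore_files(instance_uuid)
```
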
[ 1217.419985] env[68906]: DEBUG oslo.service.loopingcall [None req-0c4881c2-db84-41d7-ad4c-edd06f5d686e tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=68906) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1217.420221] env[68906]: DEBUG nova.compute.manager [-] [instance: 13eebe4e-5984-46c3-bb73-cd783ad45df6] Deallocating network for instance {{(pid=68906) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1217.420764] env[68906]: DEBUG nova.network.neutron [-] [instance: 13eebe4e-5984-46c3-bb73-cd783ad45df6] deallocate_for_instance() {{(pid=68906) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1217.424357] env[68906]: DEBUG nova.compute.manager [None req-67eb7be2-f488-498f-bd04-d2fc16581526 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: 19d8683f-32f8-48b1-960a-b91b5f82a815] Instance disappeared before build. {{(pid=68906) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1217.443948] env[68906]: DEBUG oslo_concurrency.lockutils [None req-67eb7be2-f488-498f-bd04-d2fc16581526 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Lock "19d8683f-32f8-48b1-960a-b91b5f82a815" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 191.009s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1217.452196] env[68906]: DEBUG nova.network.neutron [-] [instance: 13eebe4e-5984-46c3-bb73-cd783ad45df6] Updating instance_info_cache with network_info: [] {{(pid=68906) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1217.453633] env[68906]: DEBUG nova.compute.manager [None req-ec74f4e8-e37f-4e29-9aee-a15837533de1 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: 1fdb401a-ac25-4418-803c-fc0b2297f2d4] Starting instance... {{(pid=68906) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1217.460553] env[68906]: INFO nova.compute.manager [-] [instance: 13eebe4e-5984-46c3-bb73-cd783ad45df6] Took 0.04 seconds to deallocate network for instance. 
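
The looping-call line above ("Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return") indicates that network teardown is wrapped in a retry loop before the instance-info cache is reset to `[]`. A minimal sketch of such a retry wrapper, assuming a simple fixed delay and attempt count (the actual retry policy is not visible in this log):

```python
import time

def deallocate_network_with_retries(deallocate, attempts=3, delay=1.0):
    # Retry transient Neutron failures during deallocate_for_instance(),
    # then give up and re-raise on the final attempt. Parameters here are
    # assumptions for illustration only.
    for attempt in range(1, attempts + 1):
        try:
            return deallocate()
        except Exception:
            if attempt == attempts:
                raise
            time.sleep(delay)
```
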
[ 1217.499833] env[68906]: DEBUG oslo_concurrency.lockutils [None req-ec74f4e8-e37f-4e29-9aee-a15837533de1 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1217.500050] env[68906]: DEBUG oslo_concurrency.lockutils [None req-ec74f4e8-e37f-4e29-9aee-a15837533de1 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1217.501473] env[68906]: INFO nova.compute.claims [None req-ec74f4e8-e37f-4e29-9aee-a15837533de1 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: 1fdb401a-ac25-4418-803c-fc0b2297f2d4] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1217.550595] env[68906]: DEBUG oslo_concurrency.lockutils [None req-0c4881c2-db84-41d7-ad4c-edd06f5d686e tempest-ServersAdminTestJSON-1863634977 tempest-ServersAdminTestJSON-1863634977-project-member] Lock "13eebe4e-5984-46c3-bb73-cd783ad45df6" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.182s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1217.551592] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Lock "13eebe4e-5984-46c3-bb73-cd783ad45df6" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 6.600s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1217.551724] env[68906]: INFO nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: 13eebe4e-5984-46c3-bb73-cd783ad45df6] During sync_power_state the instance has a pending task (deleting). Skip. 
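
The lock trace above is worth reading closely: the terminate request held the per-instance lock "13eebe4e-5984-46c3-bb73-cd783ad45df6", so `_sync_power_states` waited 6.600s to acquire it, then found a pending task (deleting) and skipped the sync. That is the per-instance-lock pattern in miniature; a toy sketch, with `get_task_state` and `sync` as illustrative callables rather than nova's actual signatures:

```python
import threading

_instance_locks = {}

def query_driver_power_state_and_sync(uuid, get_task_state, sync):
    # A per-instance lock serializes the periodic power-state sync against a
    # concurrent terminate; once inside, a pending task (e.g. 'deleting')
    # means the sync is skipped, as in "During sync_power_state the instance
    # has a pending task (deleting). Skip."
    lock = _instance_locks.setdefault(uuid, threading.Lock())
    with lock:
        if get_task_state(uuid) is not None:
            return  # another operation owns this instance right now
        sync(uuid)
```
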
[ 1217.551824] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Lock "13eebe4e-5984-46c3-bb73-cd783ad45df6" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1217.832082] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-661e82ab-40ec-4ace-bcd1-5ef12050688d {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1217.839391] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c3cb0952-ae71-4460-bcb9-46285cfb52a1 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1217.870568] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-21241fad-5e2e-452c-b42b-c2c2daccbf1c {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1217.877603] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2060e856-2baf-4947-8027-e243d84f2fe8 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1217.890591] env[68906]: DEBUG nova.compute.provider_tree [None req-ec74f4e8-e37f-4e29-9aee-a15837533de1 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Inventory has not changed in ProviderTree for provider: 1119f6db-bfd7-4ef3-bdff-5c6974dc249b {{(pid=68906) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1217.899322] env[68906]: DEBUG nova.scheduler.client.report [None req-ec74f4e8-e37f-4e29-9aee-a15837533de1 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Inventory has not changed for provider 1119f6db-bfd7-4ef3-bdff-5c6974dc249b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 93, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68906) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1217.913208] env[68906]: DEBUG oslo_concurrency.lockutils [None req-ec74f4e8-e37f-4e29-9aee-a15837533de1 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.413s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1217.913680] env[68906]: DEBUG nova.compute.manager [None req-ec74f4e8-e37f-4e29-9aee-a15837533de1 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: 1fdb401a-ac25-4418-803c-fc0b2297f2d4] Start building networks asynchronously for instance.
{{(pid=68906) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1217.945511] env[68906]: DEBUG nova.compute.utils [None req-ec74f4e8-e37f-4e29-9aee-a15837533de1 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Using /dev/sd instead of None {{(pid=68906) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1217.947092] env[68906]: DEBUG nova.compute.manager [None req-ec74f4e8-e37f-4e29-9aee-a15837533de1 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: 1fdb401a-ac25-4418-803c-fc0b2297f2d4] Allocating IP information in the background. {{(pid=68906) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1217.947266] env[68906]: DEBUG nova.network.neutron [None req-ec74f4e8-e37f-4e29-9aee-a15837533de1 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: 1fdb401a-ac25-4418-803c-fc0b2297f2d4] allocate_for_instance() {{(pid=68906) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1217.956570] env[68906]: DEBUG nova.compute.manager [None req-ec74f4e8-e37f-4e29-9aee-a15837533de1 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: 1fdb401a-ac25-4418-803c-fc0b2297f2d4] Start building block device mappings for instance. {{(pid=68906) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1218.005083] env[68906]: DEBUG nova.policy [None req-ec74f4e8-e37f-4e29-9aee-a15837533de1 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'e208107293fd4f82af1f396d43464b69', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '90f212f7916446919081fcdc0527ebb0', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68906) authorize /opt/stack/nova/nova/policy.py:203}} [ 1218.022758] env[68906]: DEBUG nova.compute.manager [None req-ec74f4e8-e37f-4e29-9aee-a15837533de1 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: 1fdb401a-ac25-4418-803c-fc0b2297f2d4] Start spawning the instance on the hypervisor. 
{{(pid=68906) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1218.051078] env[68906]: DEBUG nova.virt.hardware [None req-ec74f4e8-e37f-4e29-9aee-a15837533de1 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-17T13:00:39Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-17T13:00:23Z,direct_url=,disk_format='vmdk',id=b1400c31-d33b-4e13-944f-4c645e62493e,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='1ae7bf3a375d41c6af5e7536af51ffd1',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-17T13:00:24Z,virtual_size=,visibility=), allow threads: False {{(pid=68906) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1218.051078] env[68906]: DEBUG nova.virt.hardware [None req-ec74f4e8-e37f-4e29-9aee-a15837533de1 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Flavor limits 0:0:0 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1218.051078] env[68906]: DEBUG nova.virt.hardware [None req-ec74f4e8-e37f-4e29-9aee-a15837533de1 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Image limits 0:0:0 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1218.051237] env[68906]: DEBUG nova.virt.hardware [None req-ec74f4e8-e37f-4e29-9aee-a15837533de1 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Flavor pref 0:0:0 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1218.051237] env[68906]: DEBUG nova.virt.hardware [None req-ec74f4e8-e37f-4e29-9aee-a15837533de1 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Image pref 0:0:0 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1218.051237] env[68906]: DEBUG nova.virt.hardware [None req-ec74f4e8-e37f-4e29-9aee-a15837533de1 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1218.051237] env[68906]: DEBUG nova.virt.hardware [None req-ec74f4e8-e37f-4e29-9aee-a15837533de1 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68906) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1218.051523] env[68906]: DEBUG nova.virt.hardware [None req-ec74f4e8-e37f-4e29-9aee-a15837533de1 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68906) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1218.051853] env[68906]: DEBUG nova.virt.hardware [None req-ec74f4e8-e37f-4e29-9aee-a15837533de1 tempest-ServersTestJSON-1226730598 
tempest-ServersTestJSON-1226730598-project-member] Got 1 possible topologies {{(pid=68906) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1218.052179] env[68906]: DEBUG nova.virt.hardware [None req-ec74f4e8-e37f-4e29-9aee-a15837533de1 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68906) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1218.052495] env[68906]: DEBUG nova.virt.hardware [None req-ec74f4e8-e37f-4e29-9aee-a15837533de1 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68906) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1218.053462] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-55ceb537-24c1-4dfb-a05a-0340513a9443 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1218.061284] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3967e837-1173-4782-9643-4d69a2134f95 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1218.443165] env[68906]: DEBUG nova.network.neutron [None req-ec74f4e8-e37f-4e29-9aee-a15837533de1 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: 1fdb401a-ac25-4418-803c-fc0b2297f2d4] Successfully created port: b41e00b4-3a63-4199-9865-0b1a00b03b8c {{(pid=68906) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1219.324364] env[68906]: DEBUG nova.compute.manager [req-21b51946-71dc-4027-82e9-820a4b09e16a req-ab5eea25-e56f-4fb2-ba0b-4837d23b95c9 service nova] [instance: 1fdb401a-ac25-4418-803c-fc0b2297f2d4] Received event network-vif-plugged-b41e00b4-3a63-4199-9865-0b1a00b03b8c {{(pid=68906) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1219.324787] env[68906]: DEBUG oslo_concurrency.lockutils [req-21b51946-71dc-4027-82e9-820a4b09e16a req-ab5eea25-e56f-4fb2-ba0b-4837d23b95c9 service nova] Acquiring lock "1fdb401a-ac25-4418-803c-fc0b2297f2d4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1219.324787] env[68906]: DEBUG oslo_concurrency.lockutils [req-21b51946-71dc-4027-82e9-820a4b09e16a req-ab5eea25-e56f-4fb2-ba0b-4837d23b95c9 service nova] Lock "1fdb401a-ac25-4418-803c-fc0b2297f2d4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1219.324956] env[68906]: DEBUG oslo_concurrency.lockutils [req-21b51946-71dc-4027-82e9-820a4b09e16a req-ab5eea25-e56f-4fb2-ba0b-4837d23b95c9 service nova] Lock "1fdb401a-ac25-4418-803c-fc0b2297f2d4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1219.325224] env[68906]: DEBUG nova.compute.manager [req-21b51946-71dc-4027-82e9-820a4b09e16a req-ab5eea25-e56f-4fb2-ba0b-4837d23b95c9 service nova] [instance: 1fdb401a-ac25-4418-803c-fc0b2297f2d4]
No waiting events found dispatching network-vif-plugged-b41e00b4-3a63-4199-9865-0b1a00b03b8c {{(pid=68906) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1219.325443] env[68906]: WARNING nova.compute.manager [req-21b51946-71dc-4027-82e9-820a4b09e16a req-ab5eea25-e56f-4fb2-ba0b-4837d23b95c9 service nova] [instance: 1fdb401a-ac25-4418-803c-fc0b2297f2d4] Received unexpected event network-vif-plugged-b41e00b4-3a63-4199-9865-0b1a00b03b8c for instance with vm_state building and task_state spawning. [ 1219.336038] env[68906]: DEBUG nova.network.neutron [None req-ec74f4e8-e37f-4e29-9aee-a15837533de1 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: 1fdb401a-ac25-4418-803c-fc0b2297f2d4] Successfully updated port: b41e00b4-3a63-4199-9865-0b1a00b03b8c {{(pid=68906) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1219.354552] env[68906]: DEBUG oslo_concurrency.lockutils [None req-ec74f4e8-e37f-4e29-9aee-a15837533de1 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Acquiring lock "refresh_cache-1fdb401a-ac25-4418-803c-fc0b2297f2d4" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1219.354748] env[68906]: DEBUG oslo_concurrency.lockutils [None req-ec74f4e8-e37f-4e29-9aee-a15837533de1 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Acquired lock "refresh_cache-1fdb401a-ac25-4418-803c-fc0b2297f2d4" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1219.354905] env[68906]: DEBUG nova.network.neutron [None req-ec74f4e8-e37f-4e29-9aee-a15837533de1 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: 1fdb401a-ac25-4418-803c-fc0b2297f2d4] Building network info cache for instance {{(pid=68906) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1219.392949] env[68906]: DEBUG nova.network.neutron [None req-ec74f4e8-e37f-4e29-9aee-a15837533de1 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: 1fdb401a-ac25-4418-803c-fc0b2297f2d4] Instance cache missing network info. 
{{(pid=68906) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1219.671179] env[68906]: DEBUG nova.network.neutron [None req-ec74f4e8-e37f-4e29-9aee-a15837533de1 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: 1fdb401a-ac25-4418-803c-fc0b2297f2d4] Updating instance_info_cache with network_info: [{"id": "b41e00b4-3a63-4199-9865-0b1a00b03b8c", "address": "fa:16:3e:8e:9d:d8", "network": {"id": "da6ba094-8e2a-4f76-813c-8668f482685b", "bridge": "br-int", "label": "tempest-ServersTestJSON-512380607-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "90f212f7916446919081fcdc0527ebb0", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "0cd5d325-3053-407e-a4ee-f627e82a23f9", "external-id": "nsx-vlan-transportzone-809", "segmentation_id": 809, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapb41e00b4-3a", "ovs_interfaceid": "b41e00b4-3a63-4199-9865-0b1a00b03b8c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68906) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1219.685298] env[68906]: DEBUG oslo_concurrency.lockutils [None req-ec74f4e8-e37f-4e29-9aee-a15837533de1 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Releasing lock "refresh_cache-1fdb401a-ac25-4418-803c-fc0b2297f2d4" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1219.685620] env[68906]: DEBUG nova.compute.manager [None req-ec74f4e8-e37f-4e29-9aee-a15837533de1 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: 1fdb401a-ac25-4418-803c-fc0b2297f2d4] Instance network_info: |[{"id": "b41e00b4-3a63-4199-9865-0b1a00b03b8c", "address": "fa:16:3e:8e:9d:d8", "network": {"id": "da6ba094-8e2a-4f76-813c-8668f482685b", "bridge": "br-int", "label": "tempest-ServersTestJSON-512380607-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "90f212f7916446919081fcdc0527ebb0", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "0cd5d325-3053-407e-a4ee-f627e82a23f9", "external-id": "nsx-vlan-transportzone-809", "segmentation_id": 809, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapb41e00b4-3a", "ovs_interfaceid": "b41e00b4-3a63-4199-9865-0b1a00b03b8c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=68906) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1219.686036] env[68906]: 
DEBUG nova.virt.vmwareapi.vmops [None req-ec74f4e8-e37f-4e29-9aee-a15837533de1 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: 1fdb401a-ac25-4418-803c-fc0b2297f2d4] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:8e:9d:d8', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '0cd5d325-3053-407e-a4ee-f627e82a23f9', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'b41e00b4-3a63-4199-9865-0b1a00b03b8c', 'vif_model': 'vmxnet3'}] {{(pid=68906) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1219.693961] env[68906]: DEBUG oslo.service.loopingcall [None req-ec74f4e8-e37f-4e29-9aee-a15837533de1 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=68906) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1219.694522] env[68906]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 1fdb401a-ac25-4418-803c-fc0b2297f2d4] Creating VM on the ESX host {{(pid=68906) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1219.694774] env[68906]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-c3e9efb0-5967-4b22-8116-0be9f714d13f {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1219.716199] env[68906]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1219.716199] env[68906]: value = "task-3475368" [ 1219.716199] env[68906]: _type = "Task" [ 1219.716199] env[68906]: } to complete. {{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1219.724553] env[68906]: DEBUG oslo_vmware.api [-] Task: {'id': task-3475368, 'name': CreateVM_Task} progress is 0%. {{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1220.226704] env[68906]: DEBUG oslo_vmware.api [-] Task: {'id': task-3475368, 'name': CreateVM_Task, 'duration_secs': 0.28992} completed successfully. 
{{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1220.226884] env[68906]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 1fdb401a-ac25-4418-803c-fc0b2297f2d4] Created VM on the ESX host {{(pid=68906) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1220.227603] env[68906]: DEBUG oslo_concurrency.lockutils [None req-ec74f4e8-e37f-4e29-9aee-a15837533de1 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1220.227742] env[68906]: DEBUG oslo_concurrency.lockutils [None req-ec74f4e8-e37f-4e29-9aee-a15837533de1 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Acquired lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1220.229070] env[68906]: DEBUG oslo_concurrency.lockutils [None req-ec74f4e8-e37f-4e29-9aee-a15837533de1 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1220.229070] env[68906]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-dd0e0e3e-a946-4802-a9f9-39e836e61690 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1220.232801] env[68906]: DEBUG oslo_vmware.api [None req-ec74f4e8-e37f-4e29-9aee-a15837533de1 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Waiting for the task: (returnval){ [ 1220.232801] env[68906]: value = "session[52a3cb0d-1212-490c-2669-91043b4da4d8]5206fb31-c00a-fbb8-700b-30ebbb68122e" [ 1220.232801] env[68906]: _type = "Task" [ 1220.232801] env[68906]: } to complete. {{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1220.240196] env[68906]: DEBUG oslo_vmware.api [None req-ec74f4e8-e37f-4e29-9aee-a15837533de1 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Task: {'id': session[52a3cb0d-1212-490c-2669-91043b4da4d8]5206fb31-c00a-fbb8-700b-30ebbb68122e, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1220.744028] env[68906]: DEBUG oslo_concurrency.lockutils [None req-ec74f4e8-e37f-4e29-9aee-a15837533de1 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Releasing lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1220.744028] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-ec74f4e8-e37f-4e29-9aee-a15837533de1 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: 1fdb401a-ac25-4418-803c-fc0b2297f2d4] Processing image b1400c31-d33b-4e13-944f-4c645e62493e {{(pid=68906) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1220.744355] env[68906]: DEBUG oslo_concurrency.lockutils [None req-ec74f4e8-e37f-4e29-9aee-a15837533de1 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e/b1400c31-d33b-4e13-944f-4c645e62493e.vmdk" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1221.562059] env[68906]: DEBUG nova.compute.manager [req-66d6e972-18e5-4991-aac5-7e9a295ba16d req-88e96633-f7f3-412f-ba2a-a487883fb4c6 service nova] [instance: 1fdb401a-ac25-4418-803c-fc0b2297f2d4] Received event network-changed-b41e00b4-3a63-4199-9865-0b1a00b03b8c {{(pid=68906) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1221.562201] env[68906]: DEBUG nova.compute.manager [req-66d6e972-18e5-4991-aac5-7e9a295ba16d req-88e96633-f7f3-412f-ba2a-a487883fb4c6 service nova] [instance: 1fdb401a-ac25-4418-803c-fc0b2297f2d4] Refreshing instance network info cache due to event network-changed-b41e00b4-3a63-4199-9865-0b1a00b03b8c. {{(pid=68906) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1221.562420] env[68906]: DEBUG oslo_concurrency.lockutils [req-66d6e972-18e5-4991-aac5-7e9a295ba16d req-88e96633-f7f3-412f-ba2a-a487883fb4c6 service nova] Acquiring lock "refresh_cache-1fdb401a-ac25-4418-803c-fc0b2297f2d4" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1221.562564] env[68906]: DEBUG oslo_concurrency.lockutils [req-66d6e972-18e5-4991-aac5-7e9a295ba16d req-88e96633-f7f3-412f-ba2a-a487883fb4c6 service nova] Acquired lock "refresh_cache-1fdb401a-ac25-4418-803c-fc0b2297f2d4" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1221.562723] env[68906]: DEBUG nova.network.neutron [req-66d6e972-18e5-4991-aac5-7e9a295ba16d req-88e96633-f7f3-412f-ba2a-a487883fb4c6 service nova] [instance: 1fdb401a-ac25-4418-803c-fc0b2297f2d4] Refreshing network info cache for port b41e00b4-3a63-4199-9865-0b1a00b03b8c {{(pid=68906) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1221.943097] env[68906]: DEBUG nova.network.neutron [req-66d6e972-18e5-4991-aac5-7e9a295ba16d req-88e96633-f7f3-412f-ba2a-a487883fb4c6 service nova] [instance: 1fdb401a-ac25-4418-803c-fc0b2297f2d4] Updated VIF entry in instance network info cache for port b41e00b4-3a63-4199-9865-0b1a00b03b8c. 
{{(pid=68906) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1221.943483] env[68906]: DEBUG nova.network.neutron [req-66d6e972-18e5-4991-aac5-7e9a295ba16d req-88e96633-f7f3-412f-ba2a-a487883fb4c6 service nova] [instance: 1fdb401a-ac25-4418-803c-fc0b2297f2d4] Updating instance_info_cache with network_info: [{"id": "b41e00b4-3a63-4199-9865-0b1a00b03b8c", "address": "fa:16:3e:8e:9d:d8", "network": {"id": "da6ba094-8e2a-4f76-813c-8668f482685b", "bridge": "br-int", "label": "tempest-ServersTestJSON-512380607-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "90f212f7916446919081fcdc0527ebb0", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "0cd5d325-3053-407e-a4ee-f627e82a23f9", "external-id": "nsx-vlan-transportzone-809", "segmentation_id": 809, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapb41e00b4-3a", "ovs_interfaceid": "b41e00b4-3a63-4199-9865-0b1a00b03b8c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68906) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1221.953352] env[68906]: DEBUG oslo_concurrency.lockutils [req-66d6e972-18e5-4991-aac5-7e9a295ba16d req-88e96633-f7f3-412f-ba2a-a487883fb4c6 service nova] Releasing lock "refresh_cache-1fdb401a-ac25-4418-803c-fc0b2297f2d4" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1225.113937] env[68906]: DEBUG oslo_concurrency.lockutils [None req-b8824427-275b-4d8c-bc13-f6c3124d4cfd tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Acquiring lock "1fdb401a-ac25-4418-803c-fc0b2297f2d4" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1235.176038] env[68906]: DEBUG oslo_concurrency.lockutils [None req-82d3bc4e-32c1-4af2-8d04-b9efe25d88fc tempest-ListServersNegativeTestJSON-844965327 tempest-ListServersNegativeTestJSON-844965327-project-member] Acquiring lock "9b884416-df89-4d8c-b2ab-0667db52a718" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1235.176382] env[68906]: DEBUG oslo_concurrency.lockutils [None req-82d3bc4e-32c1-4af2-8d04-b9efe25d88fc tempest-ListServersNegativeTestJSON-844965327 tempest-ListServersNegativeTestJSON-844965327-project-member] Lock "9b884416-df89-4d8c-b2ab-0667db52a718" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1235.200592] env[68906]: DEBUG oslo_concurrency.lockutils [None req-82d3bc4e-32c1-4af2-8d04-b9efe25d88fc tempest-ListServersNegativeTestJSON-844965327 tempest-ListServersNegativeTestJSON-844965327-project-member]
Acquiring lock "917ba3c3-9188-40fa-be6c-cdab27b76970" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1235.200805] env[68906]: DEBUG oslo_concurrency.lockutils [None req-82d3bc4e-32c1-4af2-8d04-b9efe25d88fc tempest-ListServersNegativeTestJSON-844965327 tempest-ListServersNegativeTestJSON-844965327-project-member] Lock "917ba3c3-9188-40fa-be6c-cdab27b76970" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1235.229497] env[68906]: DEBUG oslo_concurrency.lockutils [None req-82d3bc4e-32c1-4af2-8d04-b9efe25d88fc tempest-ListServersNegativeTestJSON-844965327 tempest-ListServersNegativeTestJSON-844965327-project-member] Acquiring lock "7803f951-a0c0-4246-b2d9-3eabadfa679d" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1235.229731] env[68906]: DEBUG oslo_concurrency.lockutils [None req-82d3bc4e-32c1-4af2-8d04-b9efe25d88fc tempest-ListServersNegativeTestJSON-844965327 tempest-ListServersNegativeTestJSON-844965327-project-member] Lock "7803f951-a0c0-4246-b2d9-3eabadfa679d" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1238.190133] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1239.140698] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1241.140319] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1241.140319] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1241.140319] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=68906) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 1242.143643] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1242.143643] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Starting heal instance info cache {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 1242.143643] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Rebuilding the list of instances to heal {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 1242.168736] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1242.168736] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1242.169300] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: 91d405f9-5ad1-4e8e-9a32-1a5ef81ecc63] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1242.169592] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: acc11633-a489-4d8f-ad76-f17049a91545] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1242.171088] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: e7286888-d79d-4632-9c06-69c1ef47fa50] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1242.171384] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: 641cca5b-d749-4331-a5e0-8acb6d47cba2] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1242.172638] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: 9bd17d9a-2f34-4e99-91b1-a7c3fbc1f29a] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1242.172638] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: 4d36bb91-0cde-44cb-8706-d17740a9cf50] Skipping network cache update for instance because it is Building. 
{{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1242.172638] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: db011373-7285-4882-8bce-d39cfa22fe80] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1242.172638] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: 1fdb401a-ac25-4418-803c-fc0b2297f2d4] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1242.172638] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Didn't find any instances for network info cache update. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 1243.166328] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1244.140827] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1245.143018] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1246.140603] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager.update_available_resource {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1246.160259] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1246.160560] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1246.160744] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1246.160895] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=68906) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1246.163497] 
env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3004403a-ce3c-4a7f-ad4a-61e72a3e31af {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1246.175243] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-eae90733-7143-488c-81fe-b1b0ac85ac11 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1246.196148] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-085d50c8-db1a-4be0-84d3-af9a7e48e567 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1246.204435] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6e44e906-410b-4c2c-b03d-59ec259989cc {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1246.238029] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180956MB free_disk=93GB free_vcpus=48 pci_devices=None {{(pid=68906) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1246.238029] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1246.238029] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1246.338064] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance a7e0a28f-42a5-442e-b962-07771d2e6a27 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1246.338064] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance eb81e9b1-b573-4d7c-9ede-f8b32a43a201 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1246.338064] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 91d405f9-5ad1-4e8e-9a32-1a5ef81ecc63 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1246.338064] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance acc11633-a489-4d8f-ad76-f17049a91545 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1246.338278] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance e7286888-d79d-4632-9c06-69c1ef47fa50 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1246.338278] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 641cca5b-d749-4331-a5e0-8acb6d47cba2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1246.338278] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 9bd17d9a-2f34-4e99-91b1-a7c3fbc1f29a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1246.338436] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 4d36bb91-0cde-44cb-8706-d17740a9cf50 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1246.338482] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance db011373-7285-4882-8bce-d39cfa22fe80 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1246.338587] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 1fdb401a-ac25-4418-803c-fc0b2297f2d4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1246.351547] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 4a616d87-7b55-4b1f-b938-9d9261e8b2cd has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1246.364214] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 38248e62-53b8-402e-aa29-d9a445b0af9d has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1246.376926] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 60ba9060-c3c3-4561-b9e9-e2df08e2e38b has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1246.390304] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance e8a14af6-ab4f-407e-943d-4dd3a46c8711 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1246.409045] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 57078f52-8070-480e-b8ea-278ef759f0a3 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1246.420242] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 6c28f571-e74a-48f9-9cc7-a9e4ddea8b09 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1246.431518] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance e0e595e3-e47e-4cf1-8977-f004eca942d1 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1246.443039] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 3c36e8a4-da45-457e-b4ef-001f4a4e595f has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1246.459563] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 582a086e-5122-41f2-8fb8-513b3734eef4 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1246.473711] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 159edc16-55bb-46eb-8fa9-7da7c1f36cd0 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1246.490371] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 13b471c5-c86e-4b55-a231-159b2219de2f has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1246.502667] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance d01b8b11-bc3b-47dc-8687-a111c1453ed9 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1246.519787] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 7466df8a-59a9-49b9-bff7-c4efbeae3eee has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1246.531441] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 89171680-c76d-4826-9236-379542661ffb has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1246.545993] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 9b884416-df89-4d8c-b2ab-0667db52a718 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1246.559554] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 917ba3c3-9188-40fa-be6c-cdab27b76970 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1246.580067] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 7803f951-a0c0-4246-b2d9-3eabadfa679d has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1246.580312] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=68906) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1246.580462] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=68906) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1246.985920] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-16cdbb07-f8ce-42d3-a4e5-8ab51e5f069b {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1246.994010] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0ba48baf-56be-4443-99d3-6f46c0c0a8e1 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1247.026338] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-239d3cd8-8b3a-4079-9574-8b8a40e90d88 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1247.035393] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8164943f-4769-4094-9c92-cdb8b0afb3ca {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1247.049323] env[68906]: DEBUG nova.compute.provider_tree [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Inventory has not changed in ProviderTree for provider: 1119f6db-bfd7-4ef3-bdff-5c6974dc249b {{(pid=68906) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1247.059719] env[68906]: DEBUG nova.scheduler.client.report [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Inventory has not changed for provider 1119f6db-bfd7-4ef3-bdff-5c6974dc249b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 
'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 93, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68906) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1247.074944] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=68906) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1247.074944] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.837s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1251.007608] env[68906]: DEBUG oslo_concurrency.lockutils [None req-c44660d0-cc4e-4ff6-b5ef-48f4a756fdda tempest-ServersAdminNegativeTestJSON-1434965427 tempest-ServersAdminNegativeTestJSON-1434965427-project-member] Acquiring lock "8a4e18b6-55c0-4397-b570-27db4541e9b3" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1251.007965] env[68906]: DEBUG oslo_concurrency.lockutils [None req-c44660d0-cc4e-4ff6-b5ef-48f4a756fdda tempest-ServersAdminNegativeTestJSON-1434965427 tempest-ServersAdminNegativeTestJSON-1434965427-project-member] Lock "8a4e18b6-55c0-4397-b570-27db4541e9b3" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1257.548091] env[68906]: DEBUG oslo_concurrency.lockutils [None req-c9c1fd81-2ddc-492d-b16c-1ea8a5a25b7e tempest-SecurityGroupsTestJSON-973572118 tempest-SecurityGroupsTestJSON-973572118-project-member] Acquiring lock "2b688987-d4cf-4ebb-83c4-d5fa7f5bcbb9" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1257.548390] env[68906]: DEBUG oslo_concurrency.lockutils [None req-c9c1fd81-2ddc-492d-b16c-1ea8a5a25b7e tempest-SecurityGroupsTestJSON-973572118 tempest-SecurityGroupsTestJSON-973572118-project-member] Lock "2b688987-d4cf-4ebb-83c4-d5fa7f5bcbb9" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1258.661133] env[68906]: DEBUG oslo_concurrency.lockutils [None req-b6863f75-a96e-4ba0-8871-761a25ce2a13 tempest-ServerDiskConfigTestJSON-1909384467 tempest-ServerDiskConfigTestJSON-1909384467-project-member] Acquiring lock "3ce59687-c677-40bd-8af4-c2f4b576e86e" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1258.661918] env[68906]: DEBUG oslo_concurrency.lockutils [None req-b6863f75-a96e-4ba0-8871-761a25ce2a13 tempest-ServerDiskConfigTestJSON-1909384467 tempest-ServerDiskConfigTestJSON-1909384467-project-member] Lock
"3ce59687-c677-40bd-8af4-c2f4b576e86e" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1263.098086] env[68906]: DEBUG oslo_concurrency.lockutils [None req-826cb636-5a60-4614-876a-92b085b28a4c tempest-ServerPasswordTestJSON-295808980 tempest-ServerPasswordTestJSON-295808980-project-member] Acquiring lock "45c0d7ba-6d21-46d1-8bcb-0318bd93f885" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1263.098365] env[68906]: DEBUG oslo_concurrency.lockutils [None req-826cb636-5a60-4614-876a-92b085b28a4c tempest-ServerPasswordTestJSON-295808980 tempest-ServerPasswordTestJSON-295808980-project-member] Lock "45c0d7ba-6d21-46d1-8bcb-0318bd93f885" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1265.688953] env[68906]: WARNING oslo_vmware.rw_handles [None req-6d6efef2-b262-4393-93df-3cbea2feabcc tempest-TenantUsagesTestJSON-2001711929 tempest-TenantUsagesTestJSON-2001711929-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1265.688953] env[68906]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1265.688953] env[68906]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1265.688953] env[68906]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1265.688953] env[68906]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1265.688953] env[68906]: ERROR oslo_vmware.rw_handles response.begin() [ 1265.688953] env[68906]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1265.688953] env[68906]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1265.688953] env[68906]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1265.688953] env[68906]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1265.688953] env[68906]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1265.688953] env[68906]: ERROR oslo_vmware.rw_handles [ 1265.689673] env[68906]: DEBUG nova.virt.vmwareapi.images [None req-6d6efef2-b262-4393-93df-3cbea2feabcc tempest-TenantUsagesTestJSON-2001711929 tempest-TenantUsagesTestJSON-2001711929-project-member] [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] Downloaded image file data b1400c31-d33b-4e13-944f-4c645e62493e to vmware_temp/0adcb25d-1777-4154-9875-507d27045746/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk on the data store datastore2 {{(pid=68906) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1265.691534] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-6d6efef2-b262-4393-93df-3cbea2feabcc tempest-TenantUsagesTestJSON-2001711929 tempest-TenantUsagesTestJSON-2001711929-project-member] [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] Caching image {{(pid=68906) 
_fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1265.691793] env[68906]: DEBUG nova.virt.vmwareapi.vm_util [None req-6d6efef2-b262-4393-93df-3cbea2feabcc tempest-TenantUsagesTestJSON-2001711929 tempest-TenantUsagesTestJSON-2001711929-project-member] Copying Virtual Disk [datastore2] vmware_temp/0adcb25d-1777-4154-9875-507d27045746/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk to [datastore2] vmware_temp/0adcb25d-1777-4154-9875-507d27045746/b1400c31-d33b-4e13-944f-4c645e62493e/b1400c31-d33b-4e13-944f-4c645e62493e.vmdk {{(pid=68906) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1265.692126] env[68906]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-66d475ea-a40c-4982-bc62-5df1701d9946 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1265.700189] env[68906]: DEBUG oslo_vmware.api [None req-6d6efef2-b262-4393-93df-3cbea2feabcc tempest-TenantUsagesTestJSON-2001711929 tempest-TenantUsagesTestJSON-2001711929-project-member] Waiting for the task: (returnval){ [ 1265.700189] env[68906]: value = "task-3475369" [ 1265.700189] env[68906]: _type = "Task" [ 1265.700189] env[68906]: } to complete. {{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1265.707999] env[68906]: DEBUG oslo_vmware.api [None req-6d6efef2-b262-4393-93df-3cbea2feabcc tempest-TenantUsagesTestJSON-2001711929 tempest-TenantUsagesTestJSON-2001711929-project-member] Task: {'id': task-3475369, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1266.210983] env[68906]: DEBUG oslo_vmware.exceptions [None req-6d6efef2-b262-4393-93df-3cbea2feabcc tempest-TenantUsagesTestJSON-2001711929 tempest-TenantUsagesTestJSON-2001711929-project-member] Fault InvalidArgument not matched. 
{{(pid=68906) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1266.211287] env[68906]: DEBUG oslo_concurrency.lockutils [None req-6d6efef2-b262-4393-93df-3cbea2feabcc tempest-TenantUsagesTestJSON-2001711929 tempest-TenantUsagesTestJSON-2001711929-project-member] Releasing lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e/b1400c31-d33b-4e13-944f-4c645e62493e.vmdk" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1266.211829] env[68906]: ERROR nova.compute.manager [None req-6d6efef2-b262-4393-93df-3cbea2feabcc tempest-TenantUsagesTestJSON-2001711929 tempest-TenantUsagesTestJSON-2001711929-project-member] [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1266.211829] env[68906]: Faults: ['InvalidArgument'] [ 1266.211829] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] Traceback (most recent call last): [ 1266.211829] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1266.211829] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] yield resources [ 1266.211829] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1266.211829] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] self.driver.spawn(context, instance, image_meta, [ 1266.211829] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1266.211829] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1266.211829] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1266.211829] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] self._fetch_image_if_missing(context, vi) [ 1266.211829] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1266.212256] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] image_cache(vi, tmp_image_ds_loc) [ 1266.212256] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1266.212256] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] vm_util.copy_virtual_disk( [ 1266.212256] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1266.212256] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] session._wait_for_task(vmdk_copy_task) [ 1266.212256] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", 
line 157, in _wait_for_task [ 1266.212256] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] return self.wait_for_task(task_ref) [ 1266.212256] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1266.212256] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] return evt.wait() [ 1266.212256] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1266.212256] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] result = hub.switch() [ 1266.212256] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1266.212256] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] return self.greenlet.switch() [ 1266.212675] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1266.212675] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] self.f(*self.args, **self.kw) [ 1266.212675] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1266.212675] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] raise exceptions.translate_fault(task_info.error) [ 1266.212675] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1266.212675] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] Faults: ['InvalidArgument'] [ 1266.212675] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] [ 1266.212675] env[68906]: INFO nova.compute.manager [None req-6d6efef2-b262-4393-93df-3cbea2feabcc tempest-TenantUsagesTestJSON-2001711929 tempest-TenantUsagesTestJSON-2001711929-project-member] [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] Terminating instance [ 1266.213718] env[68906]: DEBUG oslo_concurrency.lockutils [None req-bce01e0c-e35b-4a41-8089-b5ea423b267e tempest-ServerDiagnosticsV248Test-570687926 tempest-ServerDiagnosticsV248Test-570687926-project-member] Acquired lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e/b1400c31-d33b-4e13-944f-4c645e62493e.vmdk" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1266.213927] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-bce01e0c-e35b-4a41-8089-b5ea423b267e tempest-ServerDiagnosticsV248Test-570687926 tempest-ServerDiagnosticsV248Test-570687926-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=68906) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1266.214685] env[68906]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-2ffd69c5-78bc-44f9-ac19-a9a91de6fd83 {{(pid=68906) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1266.217013] env[68906]: DEBUG nova.compute.manager [None req-6d6efef2-b262-4393-93df-3cbea2feabcc tempest-TenantUsagesTestJSON-2001711929 tempest-TenantUsagesTestJSON-2001711929-project-member] [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] Start destroying the instance on the hypervisor. {{(pid=68906) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1266.217223] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-6d6efef2-b262-4393-93df-3cbea2feabcc tempest-TenantUsagesTestJSON-2001711929 tempest-TenantUsagesTestJSON-2001711929-project-member] [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] Destroying instance {{(pid=68906) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1266.217921] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b703f0a6-8e0c-4f1a-b9b3-e49465090af6 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1266.224589] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-6d6efef2-b262-4393-93df-3cbea2feabcc tempest-TenantUsagesTestJSON-2001711929 tempest-TenantUsagesTestJSON-2001711929-project-member] [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] Unregistering the VM {{(pid=68906) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1266.224798] env[68906]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-7e2b27a0-542d-43b3-b47e-c59dca1d4a84 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1266.226952] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-bce01e0c-e35b-4a41-8089-b5ea423b267e tempest-ServerDiagnosticsV248Test-570687926 tempest-ServerDiagnosticsV248Test-570687926-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=68906) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1266.227141] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-bce01e0c-e35b-4a41-8089-b5ea423b267e tempest-ServerDiagnosticsV248Test-570687926 tempest-ServerDiagnosticsV248Test-570687926-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=68906) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1266.228232] env[68906]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-91fdf2e4-5e85-4af9-a03f-f01f76bf4709 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1266.233183] env[68906]: DEBUG oslo_vmware.api [None req-bce01e0c-e35b-4a41-8089-b5ea423b267e tempest-ServerDiagnosticsV248Test-570687926 tempest-ServerDiagnosticsV248Test-570687926-project-member] Waiting for the task: (returnval){ [ 1266.233183] env[68906]: value = "session[52a3cb0d-1212-490c-2669-91043b4da4d8]5261a7ab-0a16-93fe-a317-ae3bf07823cc" [ 1266.233183] env[68906]: _type = "Task" [ 1266.233183] env[68906]: } to complete. 
{{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1266.239942] env[68906]: DEBUG oslo_vmware.api [None req-bce01e0c-e35b-4a41-8089-b5ea423b267e tempest-ServerDiagnosticsV248Test-570687926 tempest-ServerDiagnosticsV248Test-570687926-project-member] Task: {'id': session[52a3cb0d-1212-490c-2669-91043b4da4d8]5261a7ab-0a16-93fe-a317-ae3bf07823cc, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1266.315061] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-6d6efef2-b262-4393-93df-3cbea2feabcc tempest-TenantUsagesTestJSON-2001711929 tempest-TenantUsagesTestJSON-2001711929-project-member] [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] Unregistered the VM {{(pid=68906) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1266.315302] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-6d6efef2-b262-4393-93df-3cbea2feabcc tempest-TenantUsagesTestJSON-2001711929 tempest-TenantUsagesTestJSON-2001711929-project-member] [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] Deleting contents of the VM from datastore datastore2 {{(pid=68906) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1266.315483] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-6d6efef2-b262-4393-93df-3cbea2feabcc tempest-TenantUsagesTestJSON-2001711929 tempest-TenantUsagesTestJSON-2001711929-project-member] Deleting the datastore file [datastore2] a7e0a28f-42a5-442e-b962-07771d2e6a27 {{(pid=68906) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1266.315742] env[68906]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-26d7d3ba-2d69-46d1-b325-4e78cc7b78e3 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1266.322028] env[68906]: DEBUG oslo_vmware.api [None req-6d6efef2-b262-4393-93df-3cbea2feabcc tempest-TenantUsagesTestJSON-2001711929 tempest-TenantUsagesTestJSON-2001711929-project-member] Waiting for the task: (returnval){ [ 1266.322028] env[68906]: value = "task-3475371" [ 1266.322028] env[68906]: _type = "Task" [ 1266.322028] env[68906]: } to complete. {{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1266.329742] env[68906]: DEBUG oslo_vmware.api [None req-6d6efef2-b262-4393-93df-3cbea2feabcc tempest-TenantUsagesTestJSON-2001711929 tempest-TenantUsagesTestJSON-2001711929-project-member] Task: {'id': task-3475371, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1266.743562] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-bce01e0c-e35b-4a41-8089-b5ea423b267e tempest-ServerDiagnosticsV248Test-570687926 tempest-ServerDiagnosticsV248Test-570687926-project-member] [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201] Preparing fetch location {{(pid=68906) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1266.743827] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-bce01e0c-e35b-4a41-8089-b5ea423b267e tempest-ServerDiagnosticsV248Test-570687926 tempest-ServerDiagnosticsV248Test-570687926-project-member] Creating directory with path [datastore2] vmware_temp/61b84d49-76b5-4703-9bbc-0b3e82161d64/b1400c31-d33b-4e13-944f-4c645e62493e {{(pid=68906) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1266.744056] env[68906]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-8a24630c-2283-42fb-a141-92dc6e8299cc {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1266.755114] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-bce01e0c-e35b-4a41-8089-b5ea423b267e tempest-ServerDiagnosticsV248Test-570687926 tempest-ServerDiagnosticsV248Test-570687926-project-member] Created directory with path [datastore2] vmware_temp/61b84d49-76b5-4703-9bbc-0b3e82161d64/b1400c31-d33b-4e13-944f-4c645e62493e {{(pid=68906) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1266.755329] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-bce01e0c-e35b-4a41-8089-b5ea423b267e tempest-ServerDiagnosticsV248Test-570687926 tempest-ServerDiagnosticsV248Test-570687926-project-member] [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201] Fetch image to [datastore2] vmware_temp/61b84d49-76b5-4703-9bbc-0b3e82161d64/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk {{(pid=68906) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1266.755501] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-bce01e0c-e35b-4a41-8089-b5ea423b267e tempest-ServerDiagnosticsV248Test-570687926 tempest-ServerDiagnosticsV248Test-570687926-project-member] [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201] Downloading image file data b1400c31-d33b-4e13-944f-4c645e62493e to [datastore2] vmware_temp/61b84d49-76b5-4703-9bbc-0b3e82161d64/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk on the data store datastore2 {{(pid=68906) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1266.756266] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d0147912-dfa3-42e4-81a8-d14f12c08f75 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1266.763148] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-471780c6-107a-476e-bb89-f433fab423a7 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1266.773971] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-07b1023e-040c-4abe-9d04-2589c3710d60 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1266.806018] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-f337308e-c262-482f-a7ec-9e66699f7294 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1266.810936] env[68906]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-6559d5db-c400-4571-8a7e-0300d399d236 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1266.831363] env[68906]: DEBUG oslo_vmware.api [None req-6d6efef2-b262-4393-93df-3cbea2feabcc tempest-TenantUsagesTestJSON-2001711929 tempest-TenantUsagesTestJSON-2001711929-project-member] Task: {'id': task-3475371, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.068759} completed successfully. {{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1266.831689] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-6d6efef2-b262-4393-93df-3cbea2feabcc tempest-TenantUsagesTestJSON-2001711929 tempest-TenantUsagesTestJSON-2001711929-project-member] Deleted the datastore file {{(pid=68906) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1266.831887] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-6d6efef2-b262-4393-93df-3cbea2feabcc tempest-TenantUsagesTestJSON-2001711929 tempest-TenantUsagesTestJSON-2001711929-project-member] [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] Deleted contents of the VM from datastore datastore2 {{(pid=68906) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1266.832096] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-6d6efef2-b262-4393-93df-3cbea2feabcc tempest-TenantUsagesTestJSON-2001711929 tempest-TenantUsagesTestJSON-2001711929-project-member] [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] Instance destroyed {{(pid=68906) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1266.832312] env[68906]: INFO nova.compute.manager [None req-6d6efef2-b262-4393-93df-3cbea2feabcc tempest-TenantUsagesTestJSON-2001711929 tempest-TenantUsagesTestJSON-2001711929-project-member] [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] Took 0.62 seconds to destroy the instance on the hypervisor. 
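[editor's note] The task records above ("Waiting for the task: (returnval){...}", "progress is 0%", "completed successfully", and the earlier CopyVirtualDisk_Task fault) all follow one pattern: Nova invokes a vCenter task, then blocks in wait_for_task while the task is polled until it reaches a terminal state, and a task fault is converted into a raised exception (the "raise exceptions.translate_fault(task_info.error)" frame visible in the tracebacks). A minimal, self-contained sketch of that polling loop follows. It is illustrative only, not oslo.vmware's actual implementation; FakeTask, its states, and TaskFault are invented for the demo.

    import time

    class TaskFault(Exception):
        """Stand-in for oslo_vmware.exceptions.VimFaultException (illustrative)."""

    class FakeTask:
        """Hypothetical task handle; the real driver reads TaskInfo via the vSphere API."""
        def __init__(self, states):
            self._states = iter(states)

        def poll(self):
            return next(self._states)

    def wait_for_task(task, interval=0.5):
        # Poll until the task is done; on failure, translate the recorded
        # fault into an exception, as _poll_task does in the tracebacks above.
        while True:
            state, detail = task.poll()
            if state == 'running':
                print('progress is %s%%' % detail)   # cf. the "progress is 0%" records
                time.sleep(interval)
            elif state == 'success':
                return detail                        # cf. "completed successfully"
            else:
                raise TaskFault(detail)              # cf. Faults: ['InvalidArgument']

    # The CopyVirtualDisk_Task above went through exactly this error path:
    try:
        wait_for_task(FakeTask([('running', 0),
                                ('error', 'A specified parameter was not correct: fileType')]))
    except TaskFault as exc:
        print('task failed:', exc)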
[ 1266.834441] env[68906]: DEBUG nova.compute.claims [None req-6d6efef2-b262-4393-93df-3cbea2feabcc tempest-TenantUsagesTestJSON-2001711929 tempest-TenantUsagesTestJSON-2001711929-project-member] [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] Aborting claim: {{(pid=68906) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1266.834663] env[68906]: DEBUG oslo_concurrency.lockutils [None req-6d6efef2-b262-4393-93df-3cbea2feabcc tempest-TenantUsagesTestJSON-2001711929 tempest-TenantUsagesTestJSON-2001711929-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1266.834911] env[68906]: DEBUG oslo_concurrency.lockutils [None req-6d6efef2-b262-4393-93df-3cbea2feabcc tempest-TenantUsagesTestJSON-2001711929 tempest-TenantUsagesTestJSON-2001711929-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1266.838830] env[68906]: DEBUG nova.virt.vmwareapi.images [None req-bce01e0c-e35b-4a41-8089-b5ea423b267e tempest-ServerDiagnosticsV248Test-570687926 tempest-ServerDiagnosticsV248Test-570687926-project-member] [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201] Downloading image file data b1400c31-d33b-4e13-944f-4c645e62493e to the data store datastore2 {{(pid=68906) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1267.074905] env[68906]: DEBUG oslo_concurrency.lockutils [None req-bce01e0c-e35b-4a41-8089-b5ea423b267e tempest-ServerDiagnosticsV248Test-570687926 tempest-ServerDiagnosticsV248Test-570687926-project-member] Releasing lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e/b1400c31-d33b-4e13-944f-4c645e62493e.vmdk" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1267.076609] env[68906]: ERROR nova.compute.manager [None req-bce01e0c-e35b-4a41-8089-b5ea423b267e tempest-ServerDiagnosticsV248Test-570687926 tempest-ServerDiagnosticsV248Test-570687926-project-member] [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201] Instance failed to spawn: nova.exception.ImageNotAuthorized: Not authorized for image b1400c31-d33b-4e13-944f-4c645e62493e. 
[ 1267.076609] env[68906]: ERROR nova.compute.manager [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201] Traceback (most recent call last): [ 1267.076609] env[68906]: ERROR nova.compute.manager [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1267.076609] env[68906]: ERROR nova.compute.manager [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1267.076609] env[68906]: ERROR nova.compute.manager [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1267.076609] env[68906]: ERROR nova.compute.manager [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201] result = getattr(controller, method)(*args, **kwargs) [ 1267.076609] env[68906]: ERROR nova.compute.manager [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 197, in get [ 1267.076609] env[68906]: ERROR nova.compute.manager [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201] return self._get(image_id) [ 1267.076609] env[68906]: ERROR nova.compute.manager [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/utils.py", line 649, in inner [ 1267.076609] env[68906]: ERROR nova.compute.manager [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1267.076609] env[68906]: ERROR nova.compute.manager [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 190, in _get [ 1267.077149] env[68906]: ERROR nova.compute.manager [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201] resp, body = self.http_client.get(url, headers=header) [ 1267.077149] env[68906]: ERROR nova.compute.manager [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201] File "/opt/stack/data/venv/lib/python3.10/site-packages/keystoneauth1/adapter.py", line 395, in get [ 1267.077149] env[68906]: ERROR nova.compute.manager [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201] return self.request(url, 'GET', **kwargs) [ 1267.077149] env[68906]: ERROR nova.compute.manager [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 380, in request [ 1267.077149] env[68906]: ERROR nova.compute.manager [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201] return self._handle_response(resp) [ 1267.077149] env[68906]: ERROR nova.compute.manager [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1267.077149] env[68906]: ERROR nova.compute.manager [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201] raise exc.from_response(resp, resp.content) [ 1267.077149] env[68906]: ERROR nova.compute.manager [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. 
[ 1267.077149] env[68906]: ERROR nova.compute.manager [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201] [ 1267.077149] env[68906]: ERROR nova.compute.manager [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201] During handling of the above exception, another exception occurred: [ 1267.077149] env[68906]: ERROR nova.compute.manager [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201] [ 1267.077149] env[68906]: ERROR nova.compute.manager [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201] Traceback (most recent call last): [ 1267.077782] env[68906]: ERROR nova.compute.manager [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1267.077782] env[68906]: ERROR nova.compute.manager [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201] yield resources [ 1267.077782] env[68906]: ERROR nova.compute.manager [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1267.077782] env[68906]: ERROR nova.compute.manager [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201] self.driver.spawn(context, instance, image_meta, [ 1267.077782] env[68906]: ERROR nova.compute.manager [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1267.077782] env[68906]: ERROR nova.compute.manager [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1267.077782] env[68906]: ERROR nova.compute.manager [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1267.077782] env[68906]: ERROR nova.compute.manager [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201] self._fetch_image_if_missing(context, vi) [ 1267.077782] env[68906]: ERROR nova.compute.manager [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1267.077782] env[68906]: ERROR nova.compute.manager [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201] image_fetch(context, vi, tmp_image_ds_loc) [ 1267.077782] env[68906]: ERROR nova.compute.manager [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1267.077782] env[68906]: ERROR nova.compute.manager [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201] images.fetch_image( [ 1267.077782] env[68906]: ERROR nova.compute.manager [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1267.078552] env[68906]: ERROR nova.compute.manager [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201] metadata = IMAGE_API.get(context, image_ref) [ 1267.078552] env[68906]: ERROR nova.compute.manager [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1267.078552] env[68906]: ERROR nova.compute.manager [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201] return session.show(context, image_id, [ 1267.078552] env[68906]: ERROR nova.compute.manager [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1267.078552] env[68906]: ERROR nova.compute.manager [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201] _reraise_translated_image_exception(image_id) [ 1267.078552] env[68906]: ERROR nova.compute.manager [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201] File 
"/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1267.078552] env[68906]: ERROR nova.compute.manager [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201] raise new_exc.with_traceback(exc_trace) [ 1267.078552] env[68906]: ERROR nova.compute.manager [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1267.078552] env[68906]: ERROR nova.compute.manager [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1267.078552] env[68906]: ERROR nova.compute.manager [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1267.078552] env[68906]: ERROR nova.compute.manager [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201] result = getattr(controller, method)(*args, **kwargs) [ 1267.078552] env[68906]: ERROR nova.compute.manager [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 197, in get [ 1267.078552] env[68906]: ERROR nova.compute.manager [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201] return self._get(image_id) [ 1267.079224] env[68906]: ERROR nova.compute.manager [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/utils.py", line 649, in inner [ 1267.079224] env[68906]: ERROR nova.compute.manager [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1267.079224] env[68906]: ERROR nova.compute.manager [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 190, in _get [ 1267.079224] env[68906]: ERROR nova.compute.manager [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201] resp, body = self.http_client.get(url, headers=header) [ 1267.079224] env[68906]: ERROR nova.compute.manager [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201] File "/opt/stack/data/venv/lib/python3.10/site-packages/keystoneauth1/adapter.py", line 395, in get [ 1267.079224] env[68906]: ERROR nova.compute.manager [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201] return self.request(url, 'GET', **kwargs) [ 1267.079224] env[68906]: ERROR nova.compute.manager [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 380, in request [ 1267.079224] env[68906]: ERROR nova.compute.manager [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201] return self._handle_response(resp) [ 1267.079224] env[68906]: ERROR nova.compute.manager [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1267.079224] env[68906]: ERROR nova.compute.manager [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201] raise exc.from_response(resp, resp.content) [ 1267.079224] env[68906]: ERROR nova.compute.manager [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201] nova.exception.ImageNotAuthorized: Not authorized for image b1400c31-d33b-4e13-944f-4c645e62493e. 
[ 1267.079224] env[68906]: ERROR nova.compute.manager [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201] [ 1267.080425] env[68906]: INFO nova.compute.manager [None req-bce01e0c-e35b-4a41-8089-b5ea423b267e tempest-ServerDiagnosticsV248Test-570687926 tempest-ServerDiagnosticsV248Test-570687926-project-member] [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201] Terminating instance [ 1267.080425] env[68906]: DEBUG oslo_concurrency.lockutils [None req-236f039b-07c6-4a4d-a7d0-3f9cc96aa05c tempest-AttachInterfacesV270Test-1110429519 tempest-AttachInterfacesV270Test-1110429519-project-member] Acquired lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e/b1400c31-d33b-4e13-944f-4c645e62493e.vmdk" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1267.080425] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-236f039b-07c6-4a4d-a7d0-3f9cc96aa05c tempest-AttachInterfacesV270Test-1110429519 tempest-AttachInterfacesV270Test-1110429519-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=68906) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1267.080425] env[68906]: DEBUG oslo_concurrency.lockutils [None req-bce01e0c-e35b-4a41-8089-b5ea423b267e tempest-ServerDiagnosticsV248Test-570687926 tempest-ServerDiagnosticsV248Test-570687926-project-member] Acquiring lock "refresh_cache-eb81e9b1-b573-4d7c-9ede-f8b32a43a201" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1267.080425] env[68906]: DEBUG oslo_concurrency.lockutils [None req-bce01e0c-e35b-4a41-8089-b5ea423b267e tempest-ServerDiagnosticsV248Test-570687926 tempest-ServerDiagnosticsV248Test-570687926-project-member] Acquired lock "refresh_cache-eb81e9b1-b573-4d7c-9ede-f8b32a43a201" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1267.080591] env[68906]: DEBUG nova.network.neutron [None req-bce01e0c-e35b-4a41-8089-b5ea423b267e tempest-ServerDiagnosticsV248Test-570687926 tempest-ServerDiagnosticsV248Test-570687926-project-member] [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201] Building network info cache for instance {{(pid=68906) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1267.080591] env[68906]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-93caf7d6-aa3f-4aa3-a077-9335555c7ebd {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1267.093388] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-236f039b-07c6-4a4d-a7d0-3f9cc96aa05c tempest-AttachInterfacesV270Test-1110429519 tempest-AttachInterfacesV270Test-1110429519-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=68906) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1267.093583] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-236f039b-07c6-4a4d-a7d0-3f9cc96aa05c tempest-AttachInterfacesV270Test-1110429519 tempest-AttachInterfacesV270Test-1110429519-project-member] Folder [datastore2] devstack-image-cache_base created. 
{{(pid=68906) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1267.094645] env[68906]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-9bcccb10-ca5d-4c07-a0c5-61c72ae23459 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1267.100702] env[68906]: DEBUG oslo_vmware.api [None req-236f039b-07c6-4a4d-a7d0-3f9cc96aa05c tempest-AttachInterfacesV270Test-1110429519 tempest-AttachInterfacesV270Test-1110429519-project-member] Waiting for the task: (returnval){ [ 1267.100702] env[68906]: value = "session[52a3cb0d-1212-490c-2669-91043b4da4d8]52bbd966-ed51-186e-e580-39ce7620b581" [ 1267.100702] env[68906]: _type = "Task" [ 1267.100702] env[68906]: } to complete. {{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1267.111143] env[68906]: DEBUG oslo_vmware.api [None req-236f039b-07c6-4a4d-a7d0-3f9cc96aa05c tempest-AttachInterfacesV270Test-1110429519 tempest-AttachInterfacesV270Test-1110429519-project-member] Task: {'id': session[52a3cb0d-1212-490c-2669-91043b4da4d8]52bbd966-ed51-186e-e580-39ce7620b581, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1267.154891] env[68906]: DEBUG nova.network.neutron [None req-bce01e0c-e35b-4a41-8089-b5ea423b267e tempest-ServerDiagnosticsV248Test-570687926 tempest-ServerDiagnosticsV248Test-570687926-project-member] [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201] Instance cache missing network info. {{(pid=68906) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1267.239909] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e25833c5-5ac3-4751-b02f-693fabe5df75 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1267.247533] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e25ad219-5f04-4e0d-a241-e9d2f5226053 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1267.281339] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4a2c0e61-1c95-488b-8ab5-9669625c4615 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1267.284376] env[68906]: DEBUG nova.network.neutron [None req-bce01e0c-e35b-4a41-8089-b5ea423b267e tempest-ServerDiagnosticsV248Test-570687926 tempest-ServerDiagnosticsV248Test-570687926-project-member] [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201] Updating instance_info_cache with network_info: [] {{(pid=68906) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1267.290627] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1842e118-37cf-41fe-9370-c3f243c72df9 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1267.297931] env[68906]: DEBUG oslo_concurrency.lockutils [None req-bce01e0c-e35b-4a41-8089-b5ea423b267e tempest-ServerDiagnosticsV248Test-570687926 tempest-ServerDiagnosticsV248Test-570687926-project-member] Releasing lock "refresh_cache-eb81e9b1-b573-4d7c-9ede-f8b32a43a201" {{(pid=68906) lock 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1267.298803] env[68906]: DEBUG nova.compute.manager [None req-bce01e0c-e35b-4a41-8089-b5ea423b267e tempest-ServerDiagnosticsV248Test-570687926 tempest-ServerDiagnosticsV248Test-570687926-project-member] [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201] Start destroying the instance on the hypervisor. {{(pid=68906) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1267.298895] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-bce01e0c-e35b-4a41-8089-b5ea423b267e tempest-ServerDiagnosticsV248Test-570687926 tempest-ServerDiagnosticsV248Test-570687926-project-member] [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201] Destroying instance {{(pid=68906) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1267.308538] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-12869719-9019-428c-9812-46d972eb6ae5 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1267.311341] env[68906]: DEBUG nova.compute.provider_tree [None req-6d6efef2-b262-4393-93df-3cbea2feabcc tempest-TenantUsagesTestJSON-2001711929 tempest-TenantUsagesTestJSON-2001711929-project-member] Inventory has not changed in ProviderTree for provider: 1119f6db-bfd7-4ef3-bdff-5c6974dc249b {{(pid=68906) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1267.316274] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-bce01e0c-e35b-4a41-8089-b5ea423b267e tempest-ServerDiagnosticsV248Test-570687926 tempest-ServerDiagnosticsV248Test-570687926-project-member] [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201] Unregistering the VM {{(pid=68906) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1267.316513] env[68906]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-a2e29993-b680-4759-8fc9-d841d7f59d2f {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1267.320134] env[68906]: DEBUG nova.scheduler.client.report [None req-6d6efef2-b262-4393-93df-3cbea2feabcc tempest-TenantUsagesTestJSON-2001711929 tempest-TenantUsagesTestJSON-2001711929-project-member] Inventory has not changed for provider 1119f6db-bfd7-4ef3-bdff-5c6974dc249b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 93, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68906) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1267.337148] env[68906]: DEBUG oslo_concurrency.lockutils [None req-6d6efef2-b262-4393-93df-3cbea2feabcc tempest-TenantUsagesTestJSON-2001711929 tempest-TenantUsagesTestJSON-2001711929-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.502s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1267.337698] env[68906]: ERROR nova.compute.manager [None req-6d6efef2-b262-4393-93df-3cbea2feabcc tempest-TenantUsagesTestJSON-2001711929 tempest-TenantUsagesTestJSON-2001711929-project-member] [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] Failed to build and run 
instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1267.337698] env[68906]: Faults: ['InvalidArgument'] [ 1267.337698] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] Traceback (most recent call last): [ 1267.337698] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1267.337698] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] self.driver.spawn(context, instance, image_meta, [ 1267.337698] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1267.337698] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1267.337698] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1267.337698] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] self._fetch_image_if_missing(context, vi) [ 1267.337698] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1267.337698] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] image_cache(vi, tmp_image_ds_loc) [ 1267.337698] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1267.338100] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] vm_util.copy_virtual_disk( [ 1267.338100] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1267.338100] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] session._wait_for_task(vmdk_copy_task) [ 1267.338100] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1267.338100] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] return self.wait_for_task(task_ref) [ 1267.338100] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1267.338100] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] return evt.wait() [ 1267.338100] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1267.338100] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] result = hub.switch() [ 1267.338100] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1267.338100] env[68906]: ERROR nova.compute.manager [instance: 
a7e0a28f-42a5-442e-b962-07771d2e6a27] return self.greenlet.switch() [ 1267.338100] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1267.338100] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] self.f(*self.args, **self.kw) [ 1267.338557] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1267.338557] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] raise exceptions.translate_fault(task_info.error) [ 1267.338557] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1267.338557] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] Faults: ['InvalidArgument'] [ 1267.338557] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] [ 1267.338557] env[68906]: DEBUG nova.compute.utils [None req-6d6efef2-b262-4393-93df-3cbea2feabcc tempest-TenantUsagesTestJSON-2001711929 tempest-TenantUsagesTestJSON-2001711929-project-member] [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] VimFaultException {{(pid=68906) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1267.339945] env[68906]: DEBUG nova.compute.manager [None req-6d6efef2-b262-4393-93df-3cbea2feabcc tempest-TenantUsagesTestJSON-2001711929 tempest-TenantUsagesTestJSON-2001711929-project-member] [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] Build of instance a7e0a28f-42a5-442e-b962-07771d2e6a27 was re-scheduled: A specified parameter was not correct: fileType [ 1267.339945] env[68906]: Faults: ['InvalidArgument'] {{(pid=68906) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1267.340325] env[68906]: DEBUG nova.compute.manager [None req-6d6efef2-b262-4393-93df-3cbea2feabcc tempest-TenantUsagesTestJSON-2001711929 tempest-TenantUsagesTestJSON-2001711929-project-member] [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] Unplugging VIFs for instance {{(pid=68906) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1267.340494] env[68906]: DEBUG nova.compute.manager [None req-6d6efef2-b262-4393-93df-3cbea2feabcc tempest-TenantUsagesTestJSON-2001711929 tempest-TenantUsagesTestJSON-2001711929-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=68906) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1267.340647] env[68906]: DEBUG nova.compute.manager [None req-6d6efef2-b262-4393-93df-3cbea2feabcc tempest-TenantUsagesTestJSON-2001711929 tempest-TenantUsagesTestJSON-2001711929-project-member] [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] Deallocating network for instance {{(pid=68906) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1267.340810] env[68906]: DEBUG nova.network.neutron [None req-6d6efef2-b262-4393-93df-3cbea2feabcc tempest-TenantUsagesTestJSON-2001711929 tempest-TenantUsagesTestJSON-2001711929-project-member] [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] deallocate_for_instance() {{(pid=68906) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1267.344030] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-bce01e0c-e35b-4a41-8089-b5ea423b267e tempest-ServerDiagnosticsV248Test-570687926 tempest-ServerDiagnosticsV248Test-570687926-project-member] [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201] Unregistered the VM {{(pid=68906) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1267.344152] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-bce01e0c-e35b-4a41-8089-b5ea423b267e tempest-ServerDiagnosticsV248Test-570687926 tempest-ServerDiagnosticsV248Test-570687926-project-member] [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201] Deleting contents of the VM from datastore datastore2 {{(pid=68906) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1267.344314] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-bce01e0c-e35b-4a41-8089-b5ea423b267e tempest-ServerDiagnosticsV248Test-570687926 tempest-ServerDiagnosticsV248Test-570687926-project-member] Deleting the datastore file [datastore2] eb81e9b1-b573-4d7c-9ede-f8b32a43a201 {{(pid=68906) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1267.344543] env[68906]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-0d93b0a5-f4d0-4ddc-aaf4-6cdbde4acd17 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1267.350625] env[68906]: DEBUG oslo_vmware.api [None req-bce01e0c-e35b-4a41-8089-b5ea423b267e tempest-ServerDiagnosticsV248Test-570687926 tempest-ServerDiagnosticsV248Test-570687926-project-member] Waiting for the task: (returnval){ [ 1267.350625] env[68906]: value = "task-3475373" [ 1267.350625] env[68906]: _type = "Task" [ 1267.350625] env[68906]: } to complete. {{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1267.358244] env[68906]: DEBUG oslo_vmware.api [None req-bce01e0c-e35b-4a41-8089-b5ea423b267e tempest-ServerDiagnosticsV248Test-570687926 tempest-ServerDiagnosticsV248Test-570687926-project-member] Task: {'id': task-3475373, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1267.461309] env[68906]: DEBUG neutronclient.v2_0.client [None req-6d6efef2-b262-4393-93df-3cbea2feabcc tempest-TenantUsagesTestJSON-2001711929 tempest-TenantUsagesTestJSON-2001711929-project-member] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=68906) _handle_fault_response /opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py:262}} [ 1267.463558] env[68906]: ERROR nova.compute.manager [None req-6d6efef2-b262-4393-93df-3cbea2feabcc tempest-TenantUsagesTestJSON-2001711929 tempest-TenantUsagesTestJSON-2001711929-project-member] [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] Failed to deallocate networks: nova.exception.Unauthorized: Not authorized. [ 1267.463558] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] Traceback (most recent call last): [ 1267.463558] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1267.463558] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] self.driver.spawn(context, instance, image_meta, [ 1267.463558] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1267.463558] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1267.463558] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1267.463558] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] self._fetch_image_if_missing(context, vi) [ 1267.463558] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1267.463558] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] image_cache(vi, tmp_image_ds_loc) [ 1267.463558] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1267.463558] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] vm_util.copy_virtual_disk( [ 1267.463898] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1267.463898] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] session._wait_for_task(vmdk_copy_task) [ 1267.463898] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1267.463898] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] return self.wait_for_task(task_ref) [ 1267.463898] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1267.463898] 
env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] return evt.wait() [ 1267.463898] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1267.463898] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] result = hub.switch() [ 1267.463898] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1267.463898] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] return self.greenlet.switch() [ 1267.463898] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1267.463898] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] self.f(*self.args, **self.kw) [ 1267.463898] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1267.464262] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] raise exceptions.translate_fault(task_info.error) [ 1267.464262] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1267.464262] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] Faults: ['InvalidArgument'] [ 1267.464262] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] [ 1267.464262] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] During handling of the above exception, another exception occurred: [ 1267.464262] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] [ 1267.464262] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] Traceback (most recent call last): [ 1267.464262] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] File "/opt/stack/nova/nova/compute/manager.py", line 2430, in _do_build_and_run_instance [ 1267.464262] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] self._build_and_run_instance(context, instance, image, [ 1267.464262] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] File "/opt/stack/nova/nova/compute/manager.py", line 2722, in _build_and_run_instance [ 1267.464262] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] raise exception.RescheduledException( [ 1267.464262] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] nova.exception.RescheduledException: Build of instance a7e0a28f-42a5-442e-b962-07771d2e6a27 was re-scheduled: A specified parameter was not correct: fileType [ 1267.464262] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] Faults: ['InvalidArgument'] [ 1267.464262] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] [ 1267.464674] env[68906]: ERROR nova.compute.manager [instance: 
a7e0a28f-42a5-442e-b962-07771d2e6a27] During handling of the above exception, another exception occurred: [ 1267.464674] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] [ 1267.464674] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] Traceback (most recent call last): [ 1267.464674] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1267.464674] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] ret = obj(*args, **kwargs) [ 1267.464674] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1267.464674] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] exception_handler_v20(status_code, error_body) [ 1267.464674] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1267.464674] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] raise client_exc(message=error_message, [ 1267.464674] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1267.464674] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] Neutron server returns request_ids: ['req-75a6cb76-dade-4df5-be53-2b74bcf9209c'] [ 1267.464674] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] [ 1267.464674] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] During handling of the above exception, another exception occurred: [ 1267.465097] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] [ 1267.465097] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] Traceback (most recent call last): [ 1267.465097] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] File "/opt/stack/nova/nova/compute/manager.py", line 3019, in _cleanup_allocated_networks [ 1267.465097] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] self._deallocate_network(context, instance, requested_networks) [ 1267.465097] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] File "/opt/stack/nova/nova/compute/manager.py", line 2265, in _deallocate_network [ 1267.465097] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] self.network_api.deallocate_for_instance( [ 1267.465097] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 1267.465097] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] data = neutron.list_ports(**search_opts) [ 1267.465097] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper 
[ 1267.465097] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] ret = obj(*args, **kwargs) [ 1267.465097] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1267.465097] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] return self.list('ports', self.ports_path, retrieve_all, [ 1267.465097] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1267.465497] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] ret = obj(*args, **kwargs) [ 1267.465497] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 372, in list [ 1267.465497] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] for r in self._pagination(collection, path, **params): [ 1267.465497] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1267.465497] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] res = self.get(path, params=params) [ 1267.465497] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1267.465497] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] ret = obj(*args, **kwargs) [ 1267.465497] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 356, in get [ 1267.465497] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] return self.retry_request("GET", action, body=body, [ 1267.465497] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1267.465497] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] ret = obj(*args, **kwargs) [ 1267.465497] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1267.465497] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] return self.do_request(method, action, body=body, [ 1267.465944] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1267.465944] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] ret = obj(*args, **kwargs) [ 1267.465944] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1267.465944] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] self._handle_fault_response(status_code, replybody, resp) [ 1267.465944] 
env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] File "/opt/stack/nova/nova/network/neutron.py", line 204, in wrapper [ 1267.465944] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] raise exception.Unauthorized() [ 1267.465944] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] nova.exception.Unauthorized: Not authorized. [ 1267.465944] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] [ 1267.533347] env[68906]: INFO nova.scheduler.client.report [None req-6d6efef2-b262-4393-93df-3cbea2feabcc tempest-TenantUsagesTestJSON-2001711929 tempest-TenantUsagesTestJSON-2001711929-project-member] Deleted allocations for instance a7e0a28f-42a5-442e-b962-07771d2e6a27 [ 1267.551526] env[68906]: DEBUG oslo_concurrency.lockutils [None req-6d6efef2-b262-4393-93df-3cbea2feabcc tempest-TenantUsagesTestJSON-2001711929 tempest-TenantUsagesTestJSON-2001711929-project-member] Lock "a7e0a28f-42a5-442e-b962-07771d2e6a27" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 615.207s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1267.552602] env[68906]: DEBUG oslo_concurrency.lockutils [None req-da55a776-c1b9-49fd-b283-78bee6c2271a tempest-TenantUsagesTestJSON-2001711929 tempest-TenantUsagesTestJSON-2001711929-project-member] Lock "a7e0a28f-42a5-442e-b962-07771d2e6a27" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 418.306s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1267.552827] env[68906]: DEBUG oslo_concurrency.lockutils [None req-da55a776-c1b9-49fd-b283-78bee6c2271a tempest-TenantUsagesTestJSON-2001711929 tempest-TenantUsagesTestJSON-2001711929-project-member] Acquiring lock "a7e0a28f-42a5-442e-b962-07771d2e6a27-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1267.553049] env[68906]: DEBUG oslo_concurrency.lockutils [None req-da55a776-c1b9-49fd-b283-78bee6c2271a tempest-TenantUsagesTestJSON-2001711929 tempest-TenantUsagesTestJSON-2001711929-project-member] Lock "a7e0a28f-42a5-442e-b962-07771d2e6a27-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1267.553233] env[68906]: DEBUG oslo_concurrency.lockutils [None req-da55a776-c1b9-49fd-b283-78bee6c2271a tempest-TenantUsagesTestJSON-2001711929 tempest-TenantUsagesTestJSON-2001711929-project-member] Lock "a7e0a28f-42a5-442e-b962-07771d2e6a27-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1267.558068] env[68906]: INFO nova.compute.manager [None req-da55a776-c1b9-49fd-b283-78bee6c2271a tempest-TenantUsagesTestJSON-2001711929 tempest-TenantUsagesTestJSON-2001711929-project-member] [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] Terminating instance [ 1267.558068] env[68906]: DEBUG nova.compute.manager [None req-da55a776-c1b9-49fd-b283-78bee6c2271a tempest-TenantUsagesTestJSON-2001711929 
tempest-TenantUsagesTestJSON-2001711929-project-member] [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] Start destroying the instance on the hypervisor. {{(pid=68906) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1267.558068] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-da55a776-c1b9-49fd-b283-78bee6c2271a tempest-TenantUsagesTestJSON-2001711929 tempest-TenantUsagesTestJSON-2001711929-project-member] [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] Destroying instance {{(pid=68906) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1267.558068] env[68906]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-5105a8a3-5775-41d2-9cb3-f03deafc6a14 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1267.568040] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1b674a14-99b2-436e-9384-6728abfffabf {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1267.578126] env[68906]: DEBUG nova.compute.manager [None req-f74f9aac-5e8a-4885-aea7-d641298084da tempest-VolumesAdminNegativeTest-1259132687 tempest-VolumesAdminNegativeTest-1259132687-project-member] [instance: d6be39b6-8bbc-4657-9ceb-9a4110c29c53] Starting instance... {{(pid=68906) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1267.598403] env[68906]: WARNING nova.virt.vmwareapi.vmops [None req-da55a776-c1b9-49fd-b283-78bee6c2271a tempest-TenantUsagesTestJSON-2001711929 tempest-TenantUsagesTestJSON-2001711929-project-member] [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance a7e0a28f-42a5-442e-b962-07771d2e6a27 could not be found. [ 1267.598632] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-da55a776-c1b9-49fd-b283-78bee6c2271a tempest-TenantUsagesTestJSON-2001711929 tempest-TenantUsagesTestJSON-2001711929-project-member] [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] Instance destroyed {{(pid=68906) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1267.598811] env[68906]: INFO nova.compute.manager [None req-da55a776-c1b9-49fd-b283-78bee6c2271a tempest-TenantUsagesTestJSON-2001711929 tempest-TenantUsagesTestJSON-2001711929-project-member] [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1267.599621] env[68906]: DEBUG oslo.service.loopingcall [None req-da55a776-c1b9-49fd-b283-78bee6c2271a tempest-TenantUsagesTestJSON-2001711929 tempest-TenantUsagesTestJSON-2001711929-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=68906) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1267.599621] env[68906]: DEBUG nova.compute.manager [-] [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] Deallocating network for instance {{(pid=68906) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1267.599621] env[68906]: DEBUG nova.network.neutron [-] [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] deallocate_for_instance() {{(pid=68906) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1267.602742] env[68906]: DEBUG nova.compute.manager [None req-f74f9aac-5e8a-4885-aea7-d641298084da tempest-VolumesAdminNegativeTest-1259132687 tempest-VolumesAdminNegativeTest-1259132687-project-member] [instance: d6be39b6-8bbc-4657-9ceb-9a4110c29c53] Instance disappeared before build. {{(pid=68906) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1267.612434] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-236f039b-07c6-4a4d-a7d0-3f9cc96aa05c tempest-AttachInterfacesV270Test-1110429519 tempest-AttachInterfacesV270Test-1110429519-project-member] [instance: 91d405f9-5ad1-4e8e-9a32-1a5ef81ecc63] Preparing fetch location {{(pid=68906) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1267.612671] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-236f039b-07c6-4a4d-a7d0-3f9cc96aa05c tempest-AttachInterfacesV270Test-1110429519 tempest-AttachInterfacesV270Test-1110429519-project-member] Creating directory with path [datastore2] vmware_temp/2b63fbd5-de43-447a-b5c2-d0ff342edee0/b1400c31-d33b-4e13-944f-4c645e62493e {{(pid=68906) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1267.612883] env[68906]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-48aa06a2-c31f-4c9e-b82b-3cd19e81e900 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1267.624662] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-236f039b-07c6-4a4d-a7d0-3f9cc96aa05c tempest-AttachInterfacesV270Test-1110429519 tempest-AttachInterfacesV270Test-1110429519-project-member] Created directory with path [datastore2] vmware_temp/2b63fbd5-de43-447a-b5c2-d0ff342edee0/b1400c31-d33b-4e13-944f-4c645e62493e {{(pid=68906) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1267.624662] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-236f039b-07c6-4a4d-a7d0-3f9cc96aa05c tempest-AttachInterfacesV270Test-1110429519 tempest-AttachInterfacesV270Test-1110429519-project-member] [instance: 91d405f9-5ad1-4e8e-9a32-1a5ef81ecc63] Fetch image to [datastore2] vmware_temp/2b63fbd5-de43-447a-b5c2-d0ff342edee0/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk {{(pid=68906) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1267.624662] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-236f039b-07c6-4a4d-a7d0-3f9cc96aa05c tempest-AttachInterfacesV270Test-1110429519 tempest-AttachInterfacesV270Test-1110429519-project-member] [instance: 91d405f9-5ad1-4e8e-9a32-1a5ef81ecc63] Downloading image file data b1400c31-d33b-4e13-944f-4c645e62493e to [datastore2] vmware_temp/2b63fbd5-de43-447a-b5c2-d0ff342edee0/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk on the data store datastore2 {{(pid=68906) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1267.625850] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-c02794a4-fca7-4ba3-857b-5ff9a7db01b7 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1267.629121] env[68906]: DEBUG oslo_concurrency.lockutils [None req-f74f9aac-5e8a-4885-aea7-d641298084da tempest-VolumesAdminNegativeTest-1259132687 tempest-VolumesAdminNegativeTest-1259132687-project-member] Lock "d6be39b6-8bbc-4657-9ceb-9a4110c29c53" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 225.131s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1267.635107] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ead0bed5-fcd0-40f4-ba70-31c8eb232aec {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1267.639799] env[68906]: DEBUG nova.compute.manager [None req-dab1d23a-194f-4c67-9fbb-9bf4e98100d4 tempest-ServerDiagnosticsNegativeTest-1250564378 tempest-ServerDiagnosticsNegativeTest-1250564378-project-member] [instance: 4a616d87-7b55-4b1f-b938-9d9261e8b2cd] Starting instance... {{(pid=68906) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1267.651966] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2c393ecc-6303-450e-9483-4ad7f21d5efb {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1267.687905] env[68906]: DEBUG nova.compute.manager [None req-dab1d23a-194f-4c67-9fbb-9bf4e98100d4 tempest-ServerDiagnosticsNegativeTest-1250564378 tempest-ServerDiagnosticsNegativeTest-1250564378-project-member] [instance: 4a616d87-7b55-4b1f-b938-9d9261e8b2cd] Instance disappeared before build. 
{{(pid=68906) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1267.688857] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-35ea0ad1-6277-408d-a015-98f6db8dc6c9 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1267.695606] env[68906]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-51ef5aac-9090-4ada-be97-4c6a53ae9e44 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1267.715955] env[68906]: DEBUG nova.virt.vmwareapi.images [None req-236f039b-07c6-4a4d-a7d0-3f9cc96aa05c tempest-AttachInterfacesV270Test-1110429519 tempest-AttachInterfacesV270Test-1110429519-project-member] [instance: 91d405f9-5ad1-4e8e-9a32-1a5ef81ecc63] Downloading image file data b1400c31-d33b-4e13-944f-4c645e62493e to the data store datastore2 {{(pid=68906) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1267.718152] env[68906]: DEBUG oslo_concurrency.lockutils [None req-dab1d23a-194f-4c67-9fbb-9bf4e98100d4 tempest-ServerDiagnosticsNegativeTest-1250564378 tempest-ServerDiagnosticsNegativeTest-1250564378-project-member] Lock "4a616d87-7b55-4b1f-b938-9d9261e8b2cd" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 215.147s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1267.729477] env[68906]: DEBUG nova.compute.manager [None req-af46b527-dfe4-45d6-8cce-779746bfe2e9 tempest-ServersV294TestFqdnHostnames-2119071120 tempest-ServersV294TestFqdnHostnames-2119071120-project-member] [instance: 38248e62-53b8-402e-aa29-d9a445b0af9d] Starting instance... {{(pid=68906) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1267.761713] env[68906]: DEBUG neutronclient.v2_0.client [-] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=68906) _handle_fault_response /opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py:262}} [ 1267.761978] env[68906]: ERROR nova.network.neutron [-] Neutron client was not able to generate a valid admin token, please verify Neutron admin credential located in nova.conf: neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1267.762700] env[68906]: ERROR oslo.service.loopingcall [-] Dynamic interval looping call 'oslo_service.loopingcall.RetryDecorator.__call__.._func' failed: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. 
[ 1267.762700] env[68906]: ERROR oslo.service.loopingcall Traceback (most recent call last): [ 1267.762700] env[68906]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1267.762700] env[68906]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1267.762700] env[68906]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1267.762700] env[68906]: ERROR oslo.service.loopingcall exception_handler_v20(status_code, error_body) [ 1267.762700] env[68906]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1267.762700] env[68906]: ERROR oslo.service.loopingcall raise client_exc(message=error_message, [ 1267.762700] env[68906]: ERROR oslo.service.loopingcall neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1267.762700] env[68906]: ERROR oslo.service.loopingcall Neutron server returns request_ids: ['req-12f39442-b15f-4722-91a6-82f479cb55e4'] [ 1267.762700] env[68906]: ERROR oslo.service.loopingcall [ 1267.762700] env[68906]: ERROR oslo.service.loopingcall During handling of the above exception, another exception occurred: [ 1267.762700] env[68906]: ERROR oslo.service.loopingcall [ 1267.762700] env[68906]: ERROR oslo.service.loopingcall Traceback (most recent call last): [ 1267.762700] env[68906]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 150, in _run_loop [ 1267.762700] env[68906]: ERROR oslo.service.loopingcall result = func(*self.args, **self.kw) [ 1267.763241] env[68906]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 407, in _func [ 1267.763241] env[68906]: ERROR oslo.service.loopingcall result = f(*args, **kwargs) [ 1267.763241] env[68906]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/compute/manager.py", line 3045, in _deallocate_network_with_retries [ 1267.763241] env[68906]: ERROR oslo.service.loopingcall self._deallocate_network( [ 1267.763241] env[68906]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/compute/manager.py", line 2265, in _deallocate_network [ 1267.763241] env[68906]: ERROR oslo.service.loopingcall self.network_api.deallocate_for_instance( [ 1267.763241] env[68906]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 1267.763241] env[68906]: ERROR oslo.service.loopingcall data = neutron.list_ports(**search_opts) [ 1267.763241] env[68906]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1267.763241] env[68906]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1267.763241] env[68906]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1267.763241] env[68906]: ERROR oslo.service.loopingcall return self.list('ports', self.ports_path, retrieve_all, [ 1267.763241] env[68906]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1267.763241] env[68906]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1267.763241] env[68906]: ERROR 
oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 372, in list [ 1267.763241] env[68906]: ERROR oslo.service.loopingcall for r in self._pagination(collection, path, **params): [ 1267.763241] env[68906]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1267.763241] env[68906]: ERROR oslo.service.loopingcall res = self.get(path, params=params) [ 1267.763805] env[68906]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1267.763805] env[68906]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1267.763805] env[68906]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 356, in get [ 1267.763805] env[68906]: ERROR oslo.service.loopingcall return self.retry_request("GET", action, body=body, [ 1267.763805] env[68906]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1267.763805] env[68906]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1267.763805] env[68906]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1267.763805] env[68906]: ERROR oslo.service.loopingcall return self.do_request(method, action, body=body, [ 1267.763805] env[68906]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1267.763805] env[68906]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1267.763805] env[68906]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1267.763805] env[68906]: ERROR oslo.service.loopingcall self._handle_fault_response(status_code, replybody, resp) [ 1267.763805] env[68906]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 212, in wrapper [ 1267.763805] env[68906]: ERROR oslo.service.loopingcall raise exception.NeutronAdminCredentialConfigurationInvalid() [ 1267.763805] env[68906]: ERROR oslo.service.loopingcall nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. [ 1267.763805] env[68906]: ERROR oslo.service.loopingcall [ 1267.764328] env[68906]: ERROR nova.compute.manager [None req-da55a776-c1b9-49fd-b283-78bee6c2271a tempest-TenantUsagesTestJSON-2001711929 tempest-TenantUsagesTestJSON-2001711929-project-member] [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] Failed to deallocate network for instance. Error: Networking client is experiencing an unauthorized exception.: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. [ 1267.766255] env[68906]: DEBUG nova.compute.manager [None req-af46b527-dfe4-45d6-8cce-779746bfe2e9 tempest-ServersV294TestFqdnHostnames-2119071120 tempest-ServersV294TestFqdnHostnames-2119071120-project-member] [instance: 38248e62-53b8-402e-aa29-d9a445b0af9d] Instance disappeared before build. 
{{(pid=68906) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1267.790966] env[68906]: DEBUG oslo_concurrency.lockutils [None req-af46b527-dfe4-45d6-8cce-779746bfe2e9 tempest-ServersV294TestFqdnHostnames-2119071120 tempest-ServersV294TestFqdnHostnames-2119071120-project-member] Lock "38248e62-53b8-402e-aa29-d9a445b0af9d" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 214.651s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1267.798018] env[68906]: ERROR nova.compute.manager [None req-da55a776-c1b9-49fd-b283-78bee6c2271a tempest-TenantUsagesTestJSON-2001711929 tempest-TenantUsagesTestJSON-2001711929-project-member] [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] Setting instance vm_state to ERROR: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. [ 1267.798018] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] Traceback (most recent call last): [ 1267.798018] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1267.798018] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] ret = obj(*args, **kwargs) [ 1267.798018] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1267.798018] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] exception_handler_v20(status_code, error_body) [ 1267.798018] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1267.798018] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] raise client_exc(message=error_message, [ 1267.798018] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1267.798018] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] Neutron server returns request_ids: ['req-12f39442-b15f-4722-91a6-82f479cb55e4'] [ 1267.798018] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] [ 1267.798526] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] During handling of the above exception, another exception occurred: [ 1267.798526] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] [ 1267.798526] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] Traceback (most recent call last): [ 1267.798526] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] File "/opt/stack/nova/nova/compute/manager.py", line 3315, in do_terminate_instance [ 1267.798526] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] self._delete_instance(context, instance, bdms) [ 1267.798526] env[68906]: ERROR nova.compute.manager [instance: 
a7e0a28f-42a5-442e-b962-07771d2e6a27] File "/opt/stack/nova/nova/compute/manager.py", line 3250, in _delete_instance [ 1267.798526] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] self._shutdown_instance(context, instance, bdms) [ 1267.798526] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] File "/opt/stack/nova/nova/compute/manager.py", line 3144, in _shutdown_instance [ 1267.798526] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] self._try_deallocate_network(context, instance, requested_networks) [ 1267.798526] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] File "/opt/stack/nova/nova/compute/manager.py", line 3058, in _try_deallocate_network [ 1267.798526] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] with excutils.save_and_reraise_exception(): [ 1267.798526] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1267.798526] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] self.force_reraise() [ 1267.798910] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1267.798910] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] raise self.value [ 1267.798910] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] File "/opt/stack/nova/nova/compute/manager.py", line 3056, in _try_deallocate_network [ 1267.798910] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] _deallocate_network_with_retries() [ 1267.798910] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 436, in func [ 1267.798910] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] return evt.wait() [ 1267.798910] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1267.798910] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] result = hub.switch() [ 1267.798910] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1267.798910] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] return self.greenlet.switch() [ 1267.798910] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 150, in _run_loop [ 1267.798910] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] result = func(*self.args, **self.kw) [ 1267.799280] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 407, in _func [ 1267.799280] env[68906]: ERROR nova.compute.manager 
[instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] result = f(*args, **kwargs) [ 1267.799280] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] File "/opt/stack/nova/nova/compute/manager.py", line 3045, in _deallocate_network_with_retries [ 1267.799280] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] self._deallocate_network( [ 1267.799280] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] File "/opt/stack/nova/nova/compute/manager.py", line 2265, in _deallocate_network [ 1267.799280] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] self.network_api.deallocate_for_instance( [ 1267.799280] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 1267.799280] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] data = neutron.list_ports(**search_opts) [ 1267.799280] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1267.799280] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] ret = obj(*args, **kwargs) [ 1267.799280] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1267.799280] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] return self.list('ports', self.ports_path, retrieve_all, [ 1267.799280] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1267.799661] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] ret = obj(*args, **kwargs) [ 1267.799661] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 372, in list [ 1267.799661] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] for r in self._pagination(collection, path, **params): [ 1267.799661] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1267.799661] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] res = self.get(path, params=params) [ 1267.799661] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1267.799661] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] ret = obj(*args, **kwargs) [ 1267.799661] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 356, in get [ 1267.799661] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] return self.retry_request("GET", action, body=body, [ 1267.799661] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] File 
"/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1267.799661] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] ret = obj(*args, **kwargs) [ 1267.799661] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1267.799661] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] return self.do_request(method, action, body=body, [ 1267.800026] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1267.800026] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] ret = obj(*args, **kwargs) [ 1267.800026] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1267.800026] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] self._handle_fault_response(status_code, replybody, resp) [ 1267.800026] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] File "/opt/stack/nova/nova/network/neutron.py", line 212, in wrapper [ 1267.800026] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] raise exception.NeutronAdminCredentialConfigurationInvalid() [ 1267.800026] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. [ 1267.800026] env[68906]: ERROR nova.compute.manager [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] [ 1267.801841] env[68906]: DEBUG nova.compute.manager [None req-b7a9f4c8-e833-4e68-ad14-0b824c77959e tempest-ServerDiskConfigTestJSON-1909384467 tempest-ServerDiskConfigTestJSON-1909384467-project-member] [instance: 60ba9060-c3c3-4561-b9e9-e2df08e2e38b] Starting instance... {{(pid=68906) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1267.804523] env[68906]: DEBUG oslo_vmware.rw_handles [None req-236f039b-07c6-4a4d-a7d0-3f9cc96aa05c tempest-AttachInterfacesV270Test-1110429519 tempest-AttachInterfacesV270Test-1110429519-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/2b63fbd5-de43-447a-b5c2-d0ff342edee0/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=68906) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1267.861743] env[68906]: DEBUG nova.compute.manager [None req-b7a9f4c8-e833-4e68-ad14-0b824c77959e tempest-ServerDiskConfigTestJSON-1909384467 tempest-ServerDiskConfigTestJSON-1909384467-project-member] [instance: 60ba9060-c3c3-4561-b9e9-e2df08e2e38b] Instance disappeared before build. 
{{(pid=68906) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1267.863737] env[68906]: DEBUG oslo_concurrency.lockutils [None req-da55a776-c1b9-49fd-b283-78bee6c2271a tempest-TenantUsagesTestJSON-2001711929 tempest-TenantUsagesTestJSON-2001711929-project-member] Lock "a7e0a28f-42a5-442e-b962-07771d2e6a27" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.311s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1267.868903] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Lock "a7e0a28f-42a5-442e-b962-07771d2e6a27" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 56.917s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1267.869127] env[68906]: INFO nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] During sync_power_state the instance has a pending task (deleting). Skip. [ 1267.869314] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Lock "a7e0a28f-42a5-442e-b962-07771d2e6a27" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1267.869760] env[68906]: DEBUG oslo_vmware.rw_handles [None req-236f039b-07c6-4a4d-a7d0-3f9cc96aa05c tempest-AttachInterfacesV270Test-1110429519 tempest-AttachInterfacesV270Test-1110429519-project-member] Completed reading data from the image iterator. {{(pid=68906) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1267.869916] env[68906]: DEBUG oslo_vmware.rw_handles [None req-236f039b-07c6-4a4d-a7d0-3f9cc96aa05c tempest-AttachInterfacesV270Test-1110429519 tempest-AttachInterfacesV270Test-1110429519-project-member] Closing write handle for https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/2b63fbd5-de43-447a-b5c2-d0ff342edee0/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=68906) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1267.875201] env[68906]: DEBUG oslo_vmware.api [None req-bce01e0c-e35b-4a41-8089-b5ea423b267e tempest-ServerDiagnosticsV248Test-570687926 tempest-ServerDiagnosticsV248Test-570687926-project-member] Task: {'id': task-3475373, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.041753} completed successfully. 
{{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1267.876144] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-bce01e0c-e35b-4a41-8089-b5ea423b267e tempest-ServerDiagnosticsV248Test-570687926 tempest-ServerDiagnosticsV248Test-570687926-project-member] Deleted the datastore file {{(pid=68906) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1267.876351] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-bce01e0c-e35b-4a41-8089-b5ea423b267e tempest-ServerDiagnosticsV248Test-570687926 tempest-ServerDiagnosticsV248Test-570687926-project-member] [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201] Deleted contents of the VM from datastore datastore2 {{(pid=68906) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1267.876560] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-bce01e0c-e35b-4a41-8089-b5ea423b267e tempest-ServerDiagnosticsV248Test-570687926 tempest-ServerDiagnosticsV248Test-570687926-project-member] [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201] Instance destroyed {{(pid=68906) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1267.876736] env[68906]: INFO nova.compute.manager [None req-bce01e0c-e35b-4a41-8089-b5ea423b267e tempest-ServerDiagnosticsV248Test-570687926 tempest-ServerDiagnosticsV248Test-570687926-project-member] [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201] Took 0.58 seconds to destroy the instance on the hypervisor. [ 1267.876971] env[68906]: DEBUG oslo.service.loopingcall [None req-bce01e0c-e35b-4a41-8089-b5ea423b267e tempest-ServerDiagnosticsV248Test-570687926 tempest-ServerDiagnosticsV248Test-570687926-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=68906) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1267.880706] env[68906]: DEBUG nova.compute.manager [-] [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201] Skipping network deallocation for instance since networking was not requested. 
{{(pid=68906) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1267.883614] env[68906]: DEBUG nova.compute.claims [None req-bce01e0c-e35b-4a41-8089-b5ea423b267e tempest-ServerDiagnosticsV248Test-570687926 tempest-ServerDiagnosticsV248Test-570687926-project-member] [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201] Aborting claim: {{(pid=68906) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1267.884158] env[68906]: DEBUG oslo_concurrency.lockutils [None req-bce01e0c-e35b-4a41-8089-b5ea423b267e tempest-ServerDiagnosticsV248Test-570687926 tempest-ServerDiagnosticsV248Test-570687926-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1267.884389] env[68906]: DEBUG oslo_concurrency.lockutils [None req-bce01e0c-e35b-4a41-8089-b5ea423b267e tempest-ServerDiagnosticsV248Test-570687926 tempest-ServerDiagnosticsV248Test-570687926-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.001s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1267.895026] env[68906]: DEBUG oslo_concurrency.lockutils [None req-b7a9f4c8-e833-4e68-ad14-0b824c77959e tempest-ServerDiskConfigTestJSON-1909384467 tempest-ServerDiskConfigTestJSON-1909384467-project-member] Lock "60ba9060-c3c3-4561-b9e9-e2df08e2e38b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 207.506s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1267.901394] env[68906]: DEBUG nova.compute.manager [None req-d9b622cd-9243-4719-bcad-1768eb655752 tempest-MultipleCreateTestJSON-422056473 tempest-MultipleCreateTestJSON-422056473-project-member] [instance: e8a14af6-ab4f-407e-943d-4dd3a46c8711] Starting instance... {{(pid=68906) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1267.952954] env[68906]: INFO nova.compute.manager [None req-da55a776-c1b9-49fd-b283-78bee6c2271a tempest-TenantUsagesTestJSON-2001711929 tempest-TenantUsagesTestJSON-2001711929-project-member] [instance: a7e0a28f-42a5-442e-b962-07771d2e6a27] Successfully reverted task state from None on failure for instance. [ 1267.957514] env[68906]: ERROR oslo_messaging.rpc.server [None req-da55a776-c1b9-49fd-b283-78bee6c2271a tempest-TenantUsagesTestJSON-2001711929 tempest-TenantUsagesTestJSON-2001711929-project-member] Exception during message handling: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. 
[ 1267.957514] env[68906]: ERROR oslo_messaging.rpc.server Traceback (most recent call last): [ 1267.957514] env[68906]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1267.957514] env[68906]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1267.957514] env[68906]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1267.957514] env[68906]: ERROR oslo_messaging.rpc.server exception_handler_v20(status_code, error_body) [ 1267.957514] env[68906]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1267.957514] env[68906]: ERROR oslo_messaging.rpc.server raise client_exc(message=error_message, [ 1267.957514] env[68906]: ERROR oslo_messaging.rpc.server neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1267.957514] env[68906]: ERROR oslo_messaging.rpc.server Neutron server returns request_ids: ['req-12f39442-b15f-4722-91a6-82f479cb55e4'] [ 1267.957514] env[68906]: ERROR oslo_messaging.rpc.server [ 1267.957514] env[68906]: ERROR oslo_messaging.rpc.server During handling of the above exception, another exception occurred: [ 1267.957514] env[68906]: ERROR oslo_messaging.rpc.server [ 1267.957514] env[68906]: ERROR oslo_messaging.rpc.server Traceback (most recent call last): [ 1267.957514] env[68906]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_messaging/rpc/server.py", line 165, in _process_incoming [ 1267.957514] env[68906]: ERROR oslo_messaging.rpc.server res = self.dispatcher.dispatch(message) [ 1267.958306] env[68906]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_messaging/rpc/dispatcher.py", line 309, in dispatch [ 1267.958306] env[68906]: ERROR oslo_messaging.rpc.server return self._do_dispatch(endpoint, method, ctxt, args) [ 1267.958306] env[68906]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_messaging/rpc/dispatcher.py", line 229, in _do_dispatch [ 1267.958306] env[68906]: ERROR oslo_messaging.rpc.server result = func(ctxt, **new_args) [ 1267.958306] env[68906]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/exception_wrapper.py", line 65, in wrapped [ 1267.958306] env[68906]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception(): [ 1267.958306] env[68906]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1267.958306] env[68906]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 1267.958306] env[68906]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1267.958306] env[68906]: ERROR oslo_messaging.rpc.server raise self.value [ 1267.958306] env[68906]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/exception_wrapper.py", line 63, in wrapped [ 1267.958306] env[68906]: ERROR oslo_messaging.rpc.server return f(self, context, *args, **kw) [ 1267.958306] env[68906]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 166, in decorated_function [ 1267.958306] env[68906]: ERROR oslo_messaging.rpc.server with 
excutils.save_and_reraise_exception(): [ 1267.958306] env[68906]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1267.958306] env[68906]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 1267.958306] env[68906]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1267.958306] env[68906]: ERROR oslo_messaging.rpc.server raise self.value [ 1267.959292] env[68906]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 157, in decorated_function [ 1267.959292] env[68906]: ERROR oslo_messaging.rpc.server return function(self, context, *args, **kwargs) [ 1267.959292] env[68906]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/utils.py", line 1439, in decorated_function [ 1267.959292] env[68906]: ERROR oslo_messaging.rpc.server return function(self, context, *args, **kwargs) [ 1267.959292] env[68906]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 213, in decorated_function [ 1267.959292] env[68906]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception(): [ 1267.959292] env[68906]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1267.959292] env[68906]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 1267.959292] env[68906]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1267.959292] env[68906]: ERROR oslo_messaging.rpc.server raise self.value [ 1267.959292] env[68906]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 203, in decorated_function [ 1267.959292] env[68906]: ERROR oslo_messaging.rpc.server return function(self, context, *args, **kwargs) [ 1267.959292] env[68906]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3327, in terminate_instance [ 1267.959292] env[68906]: ERROR oslo_messaging.rpc.server do_terminate_instance(instance, bdms) [ 1267.959292] env[68906]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py", line 414, in inner [ 1267.959292] env[68906]: ERROR oslo_messaging.rpc.server return f(*args, **kwargs) [ 1267.959292] env[68906]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3322, in do_terminate_instance [ 1267.959292] env[68906]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception(): [ 1267.960152] env[68906]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1267.960152] env[68906]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 1267.960152] env[68906]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1267.960152] env[68906]: ERROR oslo_messaging.rpc.server raise self.value [ 1267.960152] env[68906]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3315, in do_terminate_instance [ 1267.960152] env[68906]: ERROR oslo_messaging.rpc.server self._delete_instance(context, instance, bdms) [ 1267.960152] env[68906]: ERROR oslo_messaging.rpc.server File 
"/opt/stack/nova/nova/compute/manager.py", line 3250, in _delete_instance [ 1267.960152] env[68906]: ERROR oslo_messaging.rpc.server self._shutdown_instance(context, instance, bdms) [ 1267.960152] env[68906]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3144, in _shutdown_instance [ 1267.960152] env[68906]: ERROR oslo_messaging.rpc.server self._try_deallocate_network(context, instance, requested_networks) [ 1267.960152] env[68906]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3058, in _try_deallocate_network [ 1267.960152] env[68906]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception(): [ 1267.960152] env[68906]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1267.960152] env[68906]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 1267.960152] env[68906]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1267.960152] env[68906]: ERROR oslo_messaging.rpc.server raise self.value [ 1267.960152] env[68906]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3056, in _try_deallocate_network [ 1267.960152] env[68906]: ERROR oslo_messaging.rpc.server _deallocate_network_with_retries() [ 1267.960959] env[68906]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 436, in func [ 1267.960959] env[68906]: ERROR oslo_messaging.rpc.server return evt.wait() [ 1267.960959] env[68906]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1267.960959] env[68906]: ERROR oslo_messaging.rpc.server result = hub.switch() [ 1267.960959] env[68906]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1267.960959] env[68906]: ERROR oslo_messaging.rpc.server return self.greenlet.switch() [ 1267.960959] env[68906]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 150, in _run_loop [ 1267.960959] env[68906]: ERROR oslo_messaging.rpc.server result = func(*self.args, **self.kw) [ 1267.960959] env[68906]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 407, in _func [ 1267.960959] env[68906]: ERROR oslo_messaging.rpc.server result = f(*args, **kwargs) [ 1267.960959] env[68906]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3045, in _deallocate_network_with_retries [ 1267.960959] env[68906]: ERROR oslo_messaging.rpc.server self._deallocate_network( [ 1267.960959] env[68906]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 2265, in _deallocate_network [ 1267.960959] env[68906]: ERROR oslo_messaging.rpc.server self.network_api.deallocate_for_instance( [ 1267.960959] env[68906]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 1267.960959] env[68906]: ERROR oslo_messaging.rpc.server data = neutron.list_ports(**search_opts) [ 1267.960959] env[68906]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1267.960959] env[68906]: ERROR 
oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1267.961889] env[68906]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1267.961889] env[68906]: ERROR oslo_messaging.rpc.server return self.list('ports', self.ports_path, retrieve_all, [ 1267.961889] env[68906]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1267.961889] env[68906]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1267.961889] env[68906]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 372, in list [ 1267.961889] env[68906]: ERROR oslo_messaging.rpc.server for r in self._pagination(collection, path, **params): [ 1267.961889] env[68906]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1267.961889] env[68906]: ERROR oslo_messaging.rpc.server res = self.get(path, params=params) [ 1267.961889] env[68906]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1267.961889] env[68906]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1267.961889] env[68906]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 356, in get [ 1267.961889] env[68906]: ERROR oslo_messaging.rpc.server return self.retry_request("GET", action, body=body, [ 1267.961889] env[68906]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1267.961889] env[68906]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1267.961889] env[68906]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1267.961889] env[68906]: ERROR oslo_messaging.rpc.server return self.do_request(method, action, body=body, [ 1267.961889] env[68906]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1267.961889] env[68906]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1267.962829] env[68906]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1267.962829] env[68906]: ERROR oslo_messaging.rpc.server self._handle_fault_response(status_code, replybody, resp) [ 1267.962829] env[68906]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 212, in wrapper [ 1267.962829] env[68906]: ERROR oslo_messaging.rpc.server raise exception.NeutronAdminCredentialConfigurationInvalid() [ 1267.962829] env[68906]: ERROR oslo_messaging.rpc.server nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. 
[ 1267.962829] env[68906]: ERROR oslo_messaging.rpc.server [ 1267.965903] env[68906]: DEBUG oslo_concurrency.lockutils [None req-d9b622cd-9243-4719-bcad-1768eb655752 tempest-MultipleCreateTestJSON-422056473 tempest-MultipleCreateTestJSON-422056473-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1268.256707] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8063e11f-0833-4796-b440-28af274aeca1 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1268.264394] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dc8d83ea-cc10-4e10-b873-a76097088760 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1268.293817] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c7c00b76-69d7-47b4-9d93-f27f6168da2b {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1268.300937] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4ecfd2ce-04a5-4736-8033-37ff1e8d8254 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1268.315422] env[68906]: DEBUG nova.compute.provider_tree [None req-bce01e0c-e35b-4a41-8089-b5ea423b267e tempest-ServerDiagnosticsV248Test-570687926 tempest-ServerDiagnosticsV248Test-570687926-project-member] Inventory has not changed in ProviderTree for provider: 1119f6db-bfd7-4ef3-bdff-5c6974dc249b {{(pid=68906) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1268.324090] env[68906]: DEBUG nova.scheduler.client.report [None req-bce01e0c-e35b-4a41-8089-b5ea423b267e tempest-ServerDiagnosticsV248Test-570687926 tempest-ServerDiagnosticsV248Test-570687926-project-member] Inventory has not changed for provider 1119f6db-bfd7-4ef3-bdff-5c6974dc249b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 93, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68906) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1268.338150] env[68906]: DEBUG oslo_concurrency.lockutils [None req-bce01e0c-e35b-4a41-8089-b5ea423b267e tempest-ServerDiagnosticsV248Test-570687926 tempest-ServerDiagnosticsV248Test-570687926-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.454s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1268.338936] env[68906]: ERROR nova.compute.manager [None req-bce01e0c-e35b-4a41-8089-b5ea423b267e tempest-ServerDiagnosticsV248Test-570687926 tempest-ServerDiagnosticsV248Test-570687926-project-member] [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201] Failed to build and run instance: nova.exception.ImageNotAuthorized: Not authorized for image b1400c31-d33b-4e13-944f-4c645e62493e. 
[ 1268.338936] env[68906]: ERROR nova.compute.manager [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201] Traceback (most recent call last): [ 1268.338936] env[68906]: ERROR nova.compute.manager [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1268.338936] env[68906]: ERROR nova.compute.manager [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1268.338936] env[68906]: ERROR nova.compute.manager [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1268.338936] env[68906]: ERROR nova.compute.manager [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201] result = getattr(controller, method)(*args, **kwargs) [ 1268.338936] env[68906]: ERROR nova.compute.manager [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 197, in get [ 1268.338936] env[68906]: ERROR nova.compute.manager [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201] return self._get(image_id) [ 1268.338936] env[68906]: ERROR nova.compute.manager [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/utils.py", line 649, in inner [ 1268.338936] env[68906]: ERROR nova.compute.manager [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1268.338936] env[68906]: ERROR nova.compute.manager [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 190, in _get [ 1268.339354] env[68906]: ERROR nova.compute.manager [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201] resp, body = self.http_client.get(url, headers=header) [ 1268.339354] env[68906]: ERROR nova.compute.manager [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201] File "/opt/stack/data/venv/lib/python3.10/site-packages/keystoneauth1/adapter.py", line 395, in get [ 1268.339354] env[68906]: ERROR nova.compute.manager [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201] return self.request(url, 'GET', **kwargs) [ 1268.339354] env[68906]: ERROR nova.compute.manager [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 380, in request [ 1268.339354] env[68906]: ERROR nova.compute.manager [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201] return self._handle_response(resp) [ 1268.339354] env[68906]: ERROR nova.compute.manager [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1268.339354] env[68906]: ERROR nova.compute.manager [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201] raise exc.from_response(resp, resp.content) [ 1268.339354] env[68906]: ERROR nova.compute.manager [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. 
[ 1268.339354] env[68906]: ERROR nova.compute.manager [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201] [ 1268.339354] env[68906]: ERROR nova.compute.manager [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201] During handling of the above exception, another exception occurred: [ 1268.339354] env[68906]: ERROR nova.compute.manager [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201] [ 1268.339354] env[68906]: ERROR nova.compute.manager [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201] Traceback (most recent call last): [ 1268.339731] env[68906]: ERROR nova.compute.manager [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1268.339731] env[68906]: ERROR nova.compute.manager [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201] self.driver.spawn(context, instance, image_meta, [ 1268.339731] env[68906]: ERROR nova.compute.manager [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1268.339731] env[68906]: ERROR nova.compute.manager [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1268.339731] env[68906]: ERROR nova.compute.manager [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1268.339731] env[68906]: ERROR nova.compute.manager [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201] self._fetch_image_if_missing(context, vi) [ 1268.339731] env[68906]: ERROR nova.compute.manager [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1268.339731] env[68906]: ERROR nova.compute.manager [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201] image_fetch(context, vi, tmp_image_ds_loc) [ 1268.339731] env[68906]: ERROR nova.compute.manager [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1268.339731] env[68906]: ERROR nova.compute.manager [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201] images.fetch_image( [ 1268.339731] env[68906]: ERROR nova.compute.manager [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1268.339731] env[68906]: ERROR nova.compute.manager [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201] metadata = IMAGE_API.get(context, image_ref) [ 1268.339731] env[68906]: ERROR nova.compute.manager [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1268.340146] env[68906]: ERROR nova.compute.manager [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201] return session.show(context, image_id, [ 1268.340146] env[68906]: ERROR nova.compute.manager [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1268.340146] env[68906]: ERROR nova.compute.manager [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201] _reraise_translated_image_exception(image_id) [ 1268.340146] env[68906]: ERROR nova.compute.manager [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1268.340146] env[68906]: ERROR nova.compute.manager [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201] raise new_exc.with_traceback(exc_trace) [ 1268.340146] env[68906]: ERROR nova.compute.manager [instance: 
eb81e9b1-b573-4d7c-9ede-f8b32a43a201] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1268.340146] env[68906]: ERROR nova.compute.manager [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1268.340146] env[68906]: ERROR nova.compute.manager [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1268.340146] env[68906]: ERROR nova.compute.manager [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201] result = getattr(controller, method)(*args, **kwargs) [ 1268.340146] env[68906]: ERROR nova.compute.manager [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 197, in get [ 1268.340146] env[68906]: ERROR nova.compute.manager [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201] return self._get(image_id) [ 1268.340146] env[68906]: ERROR nova.compute.manager [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/utils.py", line 649, in inner [ 1268.340146] env[68906]: ERROR nova.compute.manager [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1268.340541] env[68906]: ERROR nova.compute.manager [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 190, in _get [ 1268.340541] env[68906]: ERROR nova.compute.manager [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201] resp, body = self.http_client.get(url, headers=header) [ 1268.340541] env[68906]: ERROR nova.compute.manager [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201] File "/opt/stack/data/venv/lib/python3.10/site-packages/keystoneauth1/adapter.py", line 395, in get [ 1268.340541] env[68906]: ERROR nova.compute.manager [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201] return self.request(url, 'GET', **kwargs) [ 1268.340541] env[68906]: ERROR nova.compute.manager [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 380, in request [ 1268.340541] env[68906]: ERROR nova.compute.manager [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201] return self._handle_response(resp) [ 1268.340541] env[68906]: ERROR nova.compute.manager [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1268.340541] env[68906]: ERROR nova.compute.manager [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201] raise exc.from_response(resp, resp.content) [ 1268.340541] env[68906]: ERROR nova.compute.manager [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201] nova.exception.ImageNotAuthorized: Not authorized for image b1400c31-d33b-4e13-944f-4c645e62493e. [ 1268.340541] env[68906]: ERROR nova.compute.manager [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201] [ 1268.340867] env[68906]: DEBUG nova.compute.utils [None req-bce01e0c-e35b-4a41-8089-b5ea423b267e tempest-ServerDiagnosticsV248Test-570687926 tempest-ServerDiagnosticsV248Test-570687926-project-member] [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201] Not authorized for image b1400c31-d33b-4e13-944f-4c645e62493e. 
{{(pid=68906) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1268.340867] env[68906]: DEBUG oslo_concurrency.lockutils [None req-d9b622cd-9243-4719-bcad-1768eb655752 tempest-MultipleCreateTestJSON-422056473 tempest-MultipleCreateTestJSON-422056473-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.375s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1268.342127] env[68906]: INFO nova.compute.claims [None req-d9b622cd-9243-4719-bcad-1768eb655752 tempest-MultipleCreateTestJSON-422056473 tempest-MultipleCreateTestJSON-422056473-project-member] [instance: e8a14af6-ab4f-407e-943d-4dd3a46c8711] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1268.345353] env[68906]: DEBUG nova.compute.manager [None req-bce01e0c-e35b-4a41-8089-b5ea423b267e tempest-ServerDiagnosticsV248Test-570687926 tempest-ServerDiagnosticsV248Test-570687926-project-member] [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201] Build of instance eb81e9b1-b573-4d7c-9ede-f8b32a43a201 was re-scheduled: Not authorized for image b1400c31-d33b-4e13-944f-4c645e62493e. {{(pid=68906) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1268.345583] env[68906]: DEBUG nova.compute.manager [None req-bce01e0c-e35b-4a41-8089-b5ea423b267e tempest-ServerDiagnosticsV248Test-570687926 tempest-ServerDiagnosticsV248Test-570687926-project-member] [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201] Unplugging VIFs for instance {{(pid=68906) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1268.345995] env[68906]: DEBUG oslo_concurrency.lockutils [None req-bce01e0c-e35b-4a41-8089-b5ea423b267e tempest-ServerDiagnosticsV248Test-570687926 tempest-ServerDiagnosticsV248Test-570687926-project-member] Acquiring lock "refresh_cache-eb81e9b1-b573-4d7c-9ede-f8b32a43a201" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1268.346067] env[68906]: DEBUG oslo_concurrency.lockutils [None req-bce01e0c-e35b-4a41-8089-b5ea423b267e tempest-ServerDiagnosticsV248Test-570687926 tempest-ServerDiagnosticsV248Test-570687926-project-member] Acquired lock "refresh_cache-eb81e9b1-b573-4d7c-9ede-f8b32a43a201" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1268.346235] env[68906]: DEBUG nova.network.neutron [None req-bce01e0c-e35b-4a41-8089-b5ea423b267e tempest-ServerDiagnosticsV248Test-570687926 tempest-ServerDiagnosticsV248Test-570687926-project-member] [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201] Building network info cache for instance {{(pid=68906) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1268.379143] env[68906]: DEBUG oslo_concurrency.lockutils [None req-d9b622cd-9243-4719-bcad-1768eb655752 tempest-MultipleCreateTestJSON-422056473 tempest-MultipleCreateTestJSON-422056473-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.038s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1268.379924] env[68906]: DEBUG nova.compute.utils [None req-d9b622cd-9243-4719-bcad-1768eb655752 tempest-MultipleCreateTestJSON-422056473 tempest-MultipleCreateTestJSON-422056473-project-member] [instance: e8a14af6-ab4f-407e-943d-4dd3a46c8711] Instance e8a14af6-ab4f-407e-943d-4dd3a46c8711 could 
not be found. {{(pid=68906) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1268.381330] env[68906]: DEBUG nova.compute.manager [None req-d9b622cd-9243-4719-bcad-1768eb655752 tempest-MultipleCreateTestJSON-422056473 tempest-MultipleCreateTestJSON-422056473-project-member] [instance: e8a14af6-ab4f-407e-943d-4dd3a46c8711] Instance disappeared during build. {{(pid=68906) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2487}} [ 1268.381496] env[68906]: DEBUG nova.compute.manager [None req-d9b622cd-9243-4719-bcad-1768eb655752 tempest-MultipleCreateTestJSON-422056473 tempest-MultipleCreateTestJSON-422056473-project-member] [instance: e8a14af6-ab4f-407e-943d-4dd3a46c8711] Unplugging VIFs for instance {{(pid=68906) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1268.381712] env[68906]: DEBUG oslo_concurrency.lockutils [None req-d9b622cd-9243-4719-bcad-1768eb655752 tempest-MultipleCreateTestJSON-422056473 tempest-MultipleCreateTestJSON-422056473-project-member] Acquiring lock "refresh_cache-e8a14af6-ab4f-407e-943d-4dd3a46c8711" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1268.381861] env[68906]: DEBUG oslo_concurrency.lockutils [None req-d9b622cd-9243-4719-bcad-1768eb655752 tempest-MultipleCreateTestJSON-422056473 tempest-MultipleCreateTestJSON-422056473-project-member] Acquired lock "refresh_cache-e8a14af6-ab4f-407e-943d-4dd3a46c8711" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1268.382030] env[68906]: DEBUG nova.network.neutron [None req-d9b622cd-9243-4719-bcad-1768eb655752 tempest-MultipleCreateTestJSON-422056473 tempest-MultipleCreateTestJSON-422056473-project-member] [instance: e8a14af6-ab4f-407e-943d-4dd3a46c8711] Building network info cache for instance {{(pid=68906) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1268.387864] env[68906]: DEBUG nova.network.neutron [None req-bce01e0c-e35b-4a41-8089-b5ea423b267e tempest-ServerDiagnosticsV248Test-570687926 tempest-ServerDiagnosticsV248Test-570687926-project-member] [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201] Instance cache missing network info. {{(pid=68906) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1268.390781] env[68906]: DEBUG nova.compute.utils [None req-d9b622cd-9243-4719-bcad-1768eb655752 tempest-MultipleCreateTestJSON-422056473 tempest-MultipleCreateTestJSON-422056473-project-member] [instance: e8a14af6-ab4f-407e-943d-4dd3a46c8711] Can not refresh info_cache because instance was not found {{(pid=68906) refresh_info_cache_for_instance /opt/stack/nova/nova/compute/utils.py:1010}} [ 1268.409255] env[68906]: DEBUG nova.network.neutron [None req-d9b622cd-9243-4719-bcad-1768eb655752 tempest-MultipleCreateTestJSON-422056473 tempest-MultipleCreateTestJSON-422056473-project-member] [instance: e8a14af6-ab4f-407e-943d-4dd3a46c8711] Instance cache missing network info. 
{{(pid=68906) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1268.517319] env[68906]: DEBUG nova.network.neutron [None req-bce01e0c-e35b-4a41-8089-b5ea423b267e tempest-ServerDiagnosticsV248Test-570687926 tempest-ServerDiagnosticsV248Test-570687926-project-member] [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201] Updating instance_info_cache with network_info: [] {{(pid=68906) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1268.527044] env[68906]: DEBUG oslo_concurrency.lockutils [None req-bce01e0c-e35b-4a41-8089-b5ea423b267e tempest-ServerDiagnosticsV248Test-570687926 tempest-ServerDiagnosticsV248Test-570687926-project-member] Releasing lock "refresh_cache-eb81e9b1-b573-4d7c-9ede-f8b32a43a201" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1268.527206] env[68906]: DEBUG nova.compute.manager [None req-bce01e0c-e35b-4a41-8089-b5ea423b267e tempest-ServerDiagnosticsV248Test-570687926 tempest-ServerDiagnosticsV248Test-570687926-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=68906) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1268.527393] env[68906]: DEBUG nova.compute.manager [None req-bce01e0c-e35b-4a41-8089-b5ea423b267e tempest-ServerDiagnosticsV248Test-570687926 tempest-ServerDiagnosticsV248Test-570687926-project-member] [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201] Skipping network deallocation for instance since networking was not requested. {{(pid=68906) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1268.574131] env[68906]: DEBUG nova.network.neutron [None req-d9b622cd-9243-4719-bcad-1768eb655752 tempest-MultipleCreateTestJSON-422056473 tempest-MultipleCreateTestJSON-422056473-project-member] [instance: e8a14af6-ab4f-407e-943d-4dd3a46c8711] Updating instance_info_cache with network_info: [] {{(pid=68906) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1268.586049] env[68906]: DEBUG oslo_concurrency.lockutils [None req-d9b622cd-9243-4719-bcad-1768eb655752 tempest-MultipleCreateTestJSON-422056473 tempest-MultipleCreateTestJSON-422056473-project-member] Releasing lock "refresh_cache-e8a14af6-ab4f-407e-943d-4dd3a46c8711" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1268.586287] env[68906]: DEBUG nova.compute.manager [None req-d9b622cd-9243-4719-bcad-1768eb655752 tempest-MultipleCreateTestJSON-422056473 tempest-MultipleCreateTestJSON-422056473-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=68906) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1268.587766] env[68906]: DEBUG nova.compute.manager [None req-d9b622cd-9243-4719-bcad-1768eb655752 tempest-MultipleCreateTestJSON-422056473 tempest-MultipleCreateTestJSON-422056473-project-member] [instance: e8a14af6-ab4f-407e-943d-4dd3a46c8711] Deallocating network for instance {{(pid=68906) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1268.587962] env[68906]: DEBUG nova.network.neutron [None req-d9b622cd-9243-4719-bcad-1768eb655752 tempest-MultipleCreateTestJSON-422056473 tempest-MultipleCreateTestJSON-422056473-project-member] [instance: e8a14af6-ab4f-407e-943d-4dd3a46c8711] deallocate_for_instance() {{(pid=68906) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1268.609643] env[68906]: DEBUG nova.network.neutron [None req-d9b622cd-9243-4719-bcad-1768eb655752 tempest-MultipleCreateTestJSON-422056473 tempest-MultipleCreateTestJSON-422056473-project-member] [instance: e8a14af6-ab4f-407e-943d-4dd3a46c8711] Instance cache missing network info. {{(pid=68906) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1268.618902] env[68906]: DEBUG nova.network.neutron [None req-d9b622cd-9243-4719-bcad-1768eb655752 tempest-MultipleCreateTestJSON-422056473 tempest-MultipleCreateTestJSON-422056473-project-member] [instance: e8a14af6-ab4f-407e-943d-4dd3a46c8711] Updating instance_info_cache with network_info: [] {{(pid=68906) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1268.640227] env[68906]: INFO nova.compute.manager [None req-d9b622cd-9243-4719-bcad-1768eb655752 tempest-MultipleCreateTestJSON-422056473 tempest-MultipleCreateTestJSON-422056473-project-member] [instance: e8a14af6-ab4f-407e-943d-4dd3a46c8711] Took 0.05 seconds to deallocate network for instance. 
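[editor's note] The 401s above come from two different services for the same compute worker: Glance rejects the image fetch (nova.exception.ImageNotAuthorized for image b1400c31-d33b-4e13-944f-4c645e62493e) and Neutron rejects list_ports during network deallocation (nova.exception.NeutronAdminCredentialConfigurationInvalid). That pattern points at the service credentials Nova presents to Keystone being rejected, not at anything instance-specific. A minimal standalone sanity check is sketched below; it assumes the [neutron] group in /etc/nova/nova.conf carries standard keystoneauth1 password-plugin options (the group name and config path are assumptions about this deployment, not taken from the log):

    # Hypothetical sanity check for the service credentials Nova uses to
    # reach Neutron, independent of any instance lifecycle code.
    from keystoneauth1 import loading
    from oslo_config import cfg

    conf = cfg.ConfigOpts()
    # Register the standard keystoneauth1 auth/session options under the
    # same config group Nova's network client reads.
    loading.register_auth_conf_options(conf, 'neutron')
    loading.register_session_conf_options(conf, 'neutron')
    conf(['--config-file', '/etc/nova/nova.conf'])

    auth = loading.load_auth_from_conf_options(conf, 'neutron')
    sess = loading.load_session_from_conf_options(conf, 'neutron', auth=auth)

    # get_token() forces a real authentication round-trip against Keystone;
    # an Unauthorized raised here reproduces the 401 seen in the tracebacks
    # above without going through nova-compute at all.
    print(sess.get_token())

If this reproduces the 401, correcting the credentials in that section (and running the equivalent check for the image service endpoint) is the likely fix; the tempest build failures and the ERROR vm_state transitions in this log would then be downstream effects of the same misconfiguration.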
[ 1268.646565] env[68906]: INFO nova.scheduler.client.report [None req-bce01e0c-e35b-4a41-8089-b5ea423b267e tempest-ServerDiagnosticsV248Test-570687926 tempest-ServerDiagnosticsV248Test-570687926-project-member] Deleted allocations for instance eb81e9b1-b573-4d7c-9ede-f8b32a43a201 [ 1268.669352] env[68906]: DEBUG oslo_concurrency.lockutils [None req-bce01e0c-e35b-4a41-8089-b5ea423b267e tempest-ServerDiagnosticsV248Test-570687926 tempest-ServerDiagnosticsV248Test-570687926-project-member] Lock "eb81e9b1-b573-4d7c-9ede-f8b32a43a201" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 613.512s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1268.670583] env[68906]: DEBUG oslo_concurrency.lockutils [None req-8f8aa184-d11f-40ed-b40e-e09ca2f6ed86 tempest-ServerDiagnosticsV248Test-570687926 tempest-ServerDiagnosticsV248Test-570687926-project-member] Lock "eb81e9b1-b573-4d7c-9ede-f8b32a43a201" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 415.620s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1268.670851] env[68906]: DEBUG oslo_concurrency.lockutils [None req-8f8aa184-d11f-40ed-b40e-e09ca2f6ed86 tempest-ServerDiagnosticsV248Test-570687926 tempest-ServerDiagnosticsV248Test-570687926-project-member] Acquiring lock "eb81e9b1-b573-4d7c-9ede-f8b32a43a201-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1268.671106] env[68906]: DEBUG oslo_concurrency.lockutils [None req-8f8aa184-d11f-40ed-b40e-e09ca2f6ed86 tempest-ServerDiagnosticsV248Test-570687926 tempest-ServerDiagnosticsV248Test-570687926-project-member] Lock "eb81e9b1-b573-4d7c-9ede-f8b32a43a201-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1268.671277] env[68906]: DEBUG oslo_concurrency.lockutils [None req-8f8aa184-d11f-40ed-b40e-e09ca2f6ed86 tempest-ServerDiagnosticsV248Test-570687926 tempest-ServerDiagnosticsV248Test-570687926-project-member] Lock "eb81e9b1-b573-4d7c-9ede-f8b32a43a201-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1268.674394] env[68906]: INFO nova.compute.manager [None req-8f8aa184-d11f-40ed-b40e-e09ca2f6ed86 tempest-ServerDiagnosticsV248Test-570687926 tempest-ServerDiagnosticsV248Test-570687926-project-member] [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201] Terminating instance [ 1268.676323] env[68906]: DEBUG oslo_concurrency.lockutils [None req-8f8aa184-d11f-40ed-b40e-e09ca2f6ed86 tempest-ServerDiagnosticsV248Test-570687926 tempest-ServerDiagnosticsV248Test-570687926-project-member] Acquiring lock "refresh_cache-eb81e9b1-b573-4d7c-9ede-f8b32a43a201" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1268.676531] env[68906]: DEBUG oslo_concurrency.lockutils [None req-8f8aa184-d11f-40ed-b40e-e09ca2f6ed86 tempest-ServerDiagnosticsV248Test-570687926 tempest-ServerDiagnosticsV248Test-570687926-project-member] Acquired lock 
"refresh_cache-eb81e9b1-b573-4d7c-9ede-f8b32a43a201" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1268.676793] env[68906]: DEBUG nova.network.neutron [None req-8f8aa184-d11f-40ed-b40e-e09ca2f6ed86 tempest-ServerDiagnosticsV248Test-570687926 tempest-ServerDiagnosticsV248Test-570687926-project-member] [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201] Building network info cache for instance {{(pid=68906) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1268.679441] env[68906]: DEBUG nova.compute.manager [None req-d9b622cd-9243-4719-bcad-1768eb655752 tempest-MultipleCreateTestJSON-422056473 tempest-MultipleCreateTestJSON-422056473-project-member] [instance: 57078f52-8070-480e-b8ea-278ef759f0a3] Starting instance... {{(pid=68906) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1268.704363] env[68906]: DEBUG nova.compute.manager [None req-d9b622cd-9243-4719-bcad-1768eb655752 tempest-MultipleCreateTestJSON-422056473 tempest-MultipleCreateTestJSON-422056473-project-member] [instance: 57078f52-8070-480e-b8ea-278ef759f0a3] Instance disappeared before build. {{(pid=68906) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1268.721306] env[68906]: DEBUG oslo_concurrency.lockutils [None req-d9b622cd-9243-4719-bcad-1768eb655752 tempest-MultipleCreateTestJSON-422056473 tempest-MultipleCreateTestJSON-422056473-project-member] Lock "e8a14af6-ab4f-407e-943d-4dd3a46c8711" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 197.252s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1268.733781] env[68906]: DEBUG nova.network.neutron [None req-8f8aa184-d11f-40ed-b40e-e09ca2f6ed86 tempest-ServerDiagnosticsV248Test-570687926 tempest-ServerDiagnosticsV248Test-570687926-project-member] [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201] Instance cache missing network info. {{(pid=68906) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1268.741099] env[68906]: DEBUG oslo_concurrency.lockutils [None req-d9b622cd-9243-4719-bcad-1768eb655752 tempest-MultipleCreateTestJSON-422056473 tempest-MultipleCreateTestJSON-422056473-project-member] Lock "57078f52-8070-480e-b8ea-278ef759f0a3" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 197.241s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1268.742792] env[68906]: DEBUG nova.compute.manager [None req-8cc1b37d-3cc0-4e00-8b4e-1ea1e183df21 tempest-ServerAddressesTestJSON-218061599 tempest-ServerAddressesTestJSON-218061599-project-member] [instance: 6c28f571-e74a-48f9-9cc7-a9e4ddea8b09] Starting instance... {{(pid=68906) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1268.753045] env[68906]: DEBUG nova.compute.manager [None req-bc278641-e9dc-495f-b85d-d30d2e1da09f tempest-ServerShowV254Test-1577433897 tempest-ServerShowV254Test-1577433897-project-member] [instance: e0e595e3-e47e-4cf1-8977-f004eca942d1] Starting instance... 
{{(pid=68906) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1268.800941] env[68906]: DEBUG nova.network.neutron [None req-8f8aa184-d11f-40ed-b40e-e09ca2f6ed86 tempest-ServerDiagnosticsV248Test-570687926 tempest-ServerDiagnosticsV248Test-570687926-project-member] [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201] Updating instance_info_cache with network_info: [] {{(pid=68906) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1268.809753] env[68906]: DEBUG oslo_concurrency.lockutils [None req-8f8aa184-d11f-40ed-b40e-e09ca2f6ed86 tempest-ServerDiagnosticsV248Test-570687926 tempest-ServerDiagnosticsV248Test-570687926-project-member] Releasing lock "refresh_cache-eb81e9b1-b573-4d7c-9ede-f8b32a43a201" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1268.810325] env[68906]: DEBUG nova.compute.manager [None req-8f8aa184-d11f-40ed-b40e-e09ca2f6ed86 tempest-ServerDiagnosticsV248Test-570687926 tempest-ServerDiagnosticsV248Test-570687926-project-member] [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201] Start destroying the instance on the hypervisor. {{(pid=68906) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1268.810930] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-8f8aa184-d11f-40ed-b40e-e09ca2f6ed86 tempest-ServerDiagnosticsV248Test-570687926 tempest-ServerDiagnosticsV248Test-570687926-project-member] [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201] Destroying instance {{(pid=68906) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1268.811889] env[68906]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-6daf9f17-3892-4dc6-a963-a75cd032be01 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1268.823398] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ff2a45c4-c92d-499d-9d36-c7b3e76bcfab {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1268.862501] env[68906]: WARNING nova.virt.vmwareapi.vmops [None req-8f8aa184-d11f-40ed-b40e-e09ca2f6ed86 tempest-ServerDiagnosticsV248Test-570687926 tempest-ServerDiagnosticsV248Test-570687926-project-member] [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance eb81e9b1-b573-4d7c-9ede-f8b32a43a201 could not be found. [ 1268.862501] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-8f8aa184-d11f-40ed-b40e-e09ca2f6ed86 tempest-ServerDiagnosticsV248Test-570687926 tempest-ServerDiagnosticsV248Test-570687926-project-member] [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201] Instance destroyed {{(pid=68906) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1268.862501] env[68906]: INFO nova.compute.manager [None req-8f8aa184-d11f-40ed-b40e-e09ca2f6ed86 tempest-ServerDiagnosticsV248Test-570687926 tempest-ServerDiagnosticsV248Test-570687926-project-member] [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201] Took 0.05 seconds to destroy the instance on the hypervisor. [ 1268.862501] env[68906]: DEBUG oslo.service.loopingcall [None req-8f8aa184-d11f-40ed-b40e-e09ca2f6ed86 tempest-ServerDiagnosticsV248Test-570687926 tempest-ServerDiagnosticsV248Test-570687926-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=68906) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1268.863013] env[68906]: DEBUG oslo_concurrency.lockutils [None req-8cc1b37d-3cc0-4e00-8b4e-1ea1e183df21 tempest-ServerAddressesTestJSON-218061599 tempest-ServerAddressesTestJSON-218061599-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1268.863210] env[68906]: DEBUG oslo_concurrency.lockutils [None req-8cc1b37d-3cc0-4e00-8b4e-1ea1e183df21 tempest-ServerAddressesTestJSON-218061599 tempest-ServerAddressesTestJSON-218061599-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1268.865579] env[68906]: INFO nova.compute.claims [None req-8cc1b37d-3cc0-4e00-8b4e-1ea1e183df21 tempest-ServerAddressesTestJSON-218061599 tempest-ServerAddressesTestJSON-218061599-project-member] [instance: 6c28f571-e74a-48f9-9cc7-a9e4ddea8b09] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1268.867786] env[68906]: DEBUG nova.compute.manager [-] [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201] Deallocating network for instance {{(pid=68906) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1268.868521] env[68906]: DEBUG nova.network.neutron [-] [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201] deallocate_for_instance() {{(pid=68906) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1268.870496] env[68906]: DEBUG oslo_concurrency.lockutils [None req-bc278641-e9dc-495f-b85d-d30d2e1da09f tempest-ServerShowV254Test-1577433897 tempest-ServerShowV254Test-1577433897-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1269.000195] env[68906]: DEBUG neutronclient.v2_0.client [-] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=68906) _handle_fault_response /opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py:262}} [ 1269.000480] env[68906]: ERROR nova.network.neutron [-] Neutron client was not able to generate a valid admin token, please verify Neutron admin credential located in nova.conf: neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1269.001044] env[68906]: ERROR oslo.service.loopingcall [-] Dynamic interval looping call 'oslo_service.loopingcall.RetryDecorator.__call__.._func' failed: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. 
[ 1269.001044] env[68906]: ERROR oslo.service.loopingcall Traceback (most recent call last): [ 1269.001044] env[68906]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1269.001044] env[68906]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1269.001044] env[68906]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1269.001044] env[68906]: ERROR oslo.service.loopingcall exception_handler_v20(status_code, error_body) [ 1269.001044] env[68906]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1269.001044] env[68906]: ERROR oslo.service.loopingcall raise client_exc(message=error_message, [ 1269.001044] env[68906]: ERROR oslo.service.loopingcall neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1269.001044] env[68906]: ERROR oslo.service.loopingcall Neutron server returns request_ids: ['req-9b0eb5b6-690a-4095-b214-ceef247efeb6'] [ 1269.001044] env[68906]: ERROR oslo.service.loopingcall [ 1269.001044] env[68906]: ERROR oslo.service.loopingcall During handling of the above exception, another exception occurred: [ 1269.001044] env[68906]: ERROR oslo.service.loopingcall [ 1269.001044] env[68906]: ERROR oslo.service.loopingcall Traceback (most recent call last): [ 1269.001044] env[68906]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 150, in _run_loop [ 1269.001044] env[68906]: ERROR oslo.service.loopingcall result = func(*self.args, **self.kw) [ 1269.001596] env[68906]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 407, in _func [ 1269.001596] env[68906]: ERROR oslo.service.loopingcall result = f(*args, **kwargs) [ 1269.001596] env[68906]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/compute/manager.py", line 3045, in _deallocate_network_with_retries [ 1269.001596] env[68906]: ERROR oslo.service.loopingcall self._deallocate_network( [ 1269.001596] env[68906]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/compute/manager.py", line 2265, in _deallocate_network [ 1269.001596] env[68906]: ERROR oslo.service.loopingcall self.network_api.deallocate_for_instance( [ 1269.001596] env[68906]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 1269.001596] env[68906]: ERROR oslo.service.loopingcall data = neutron.list_ports(**search_opts) [ 1269.001596] env[68906]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1269.001596] env[68906]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1269.001596] env[68906]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1269.001596] env[68906]: ERROR oslo.service.loopingcall return self.list('ports', self.ports_path, retrieve_all, [ 1269.001596] env[68906]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1269.001596] env[68906]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1269.001596] env[68906]: ERROR 
oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 372, in list [ 1269.001596] env[68906]: ERROR oslo.service.loopingcall for r in self._pagination(collection, path, **params): [ 1269.001596] env[68906]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1269.001596] env[68906]: ERROR oslo.service.loopingcall res = self.get(path, params=params) [ 1269.002160] env[68906]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1269.002160] env[68906]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1269.002160] env[68906]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 356, in get [ 1269.002160] env[68906]: ERROR oslo.service.loopingcall return self.retry_request("GET", action, body=body, [ 1269.002160] env[68906]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1269.002160] env[68906]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1269.002160] env[68906]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1269.002160] env[68906]: ERROR oslo.service.loopingcall return self.do_request(method, action, body=body, [ 1269.002160] env[68906]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1269.002160] env[68906]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1269.002160] env[68906]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1269.002160] env[68906]: ERROR oslo.service.loopingcall self._handle_fault_response(status_code, replybody, resp) [ 1269.002160] env[68906]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 212, in wrapper [ 1269.002160] env[68906]: ERROR oslo.service.loopingcall raise exception.NeutronAdminCredentialConfigurationInvalid() [ 1269.002160] env[68906]: ERROR oslo.service.loopingcall nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. [ 1269.002160] env[68906]: ERROR oslo.service.loopingcall [ 1269.002666] env[68906]: ERROR nova.compute.manager [None req-8f8aa184-d11f-40ed-b40e-e09ca2f6ed86 tempest-ServerDiagnosticsV248Test-570687926 tempest-ServerDiagnosticsV248Test-570687926-project-member] [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201] Failed to deallocate network for instance. Error: Networking client is experiencing an unauthorized exception.: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. [ 1269.037455] env[68906]: ERROR nova.compute.manager [None req-8f8aa184-d11f-40ed-b40e-e09ca2f6ed86 tempest-ServerDiagnosticsV248Test-570687926 tempest-ServerDiagnosticsV248Test-570687926-project-member] [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201] Setting instance vm_state to ERROR: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. 
[ 1269.037455] env[68906]: ERROR nova.compute.manager [None req-8f8aa184-d11f-40ed-b40e-e09ca2f6ed86 tempest-ServerDiagnosticsV248Test-570687926 tempest-ServerDiagnosticsV248Test-570687926-project-member] [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201] Setting instance vm_state to ERROR: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception.
[ 1269.037455] env[68906]: ERROR nova.compute.manager [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201] Traceback (most recent call last):
[ 1269.037455] env[68906]: ERROR nova.compute.manager [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201]   File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1269.037455] env[68906]: ERROR nova.compute.manager [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201]     ret = obj(*args, **kwargs)
[ 1269.037455] env[68906]: ERROR nova.compute.manager [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201]   File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response
[ 1269.037455] env[68906]: ERROR nova.compute.manager [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201]     exception_handler_v20(status_code, error_body)
[ 1269.037455] env[68906]: ERROR nova.compute.manager [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201]   File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20
[ 1269.037455] env[68906]: ERROR nova.compute.manager [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201]     raise client_exc(message=error_message,
[ 1269.037455] env[68906]: ERROR nova.compute.manager [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}}
[ 1269.037455] env[68906]: ERROR nova.compute.manager [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201] Neutron server returns request_ids: ['req-9b0eb5b6-690a-4095-b214-ceef247efeb6']
[ 1269.037455] env[68906]: ERROR nova.compute.manager [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201]
[ 1269.037941] env[68906]: ERROR nova.compute.manager [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201] During handling of the above exception, another exception occurred:
[ 1269.037941] env[68906]: ERROR nova.compute.manager [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201]
[ 1269.037941] env[68906]: ERROR nova.compute.manager [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201] Traceback (most recent call last):
[ 1269.037941] env[68906]: ERROR nova.compute.manager [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201]   File "/opt/stack/nova/nova/compute/manager.py", line 3315, in do_terminate_instance
[ 1269.037941] env[68906]: ERROR nova.compute.manager [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201]     self._delete_instance(context, instance, bdms)
[ 1269.037941] env[68906]: ERROR nova.compute.manager [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201]   File "/opt/stack/nova/nova/compute/manager.py", line 3250, in _delete_instance
[ 1269.037941] env[68906]: ERROR nova.compute.manager [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201]     self._shutdown_instance(context, instance, bdms)
[ 1269.037941] env[68906]: ERROR nova.compute.manager [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201]   File "/opt/stack/nova/nova/compute/manager.py", line 3144, in _shutdown_instance
[ 1269.037941] env[68906]: ERROR nova.compute.manager [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201]     self._try_deallocate_network(context, instance, requested_networks)
[ 1269.037941] env[68906]: ERROR nova.compute.manager [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201]   File "/opt/stack/nova/nova/compute/manager.py", line 3058, in _try_deallocate_network
[ 1269.037941] env[68906]: ERROR nova.compute.manager [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201]     with excutils.save_and_reraise_exception():
[ 1269.037941] env[68906]: ERROR nova.compute.manager [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201]   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 1269.037941] env[68906]: ERROR nova.compute.manager [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201]     self.force_reraise()
[ 1269.038441] env[68906]: ERROR nova.compute.manager [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201]   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 1269.038441] env[68906]: ERROR nova.compute.manager [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201]     raise self.value
[ 1269.038441] env[68906]: ERROR nova.compute.manager [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201]   File "/opt/stack/nova/nova/compute/manager.py", line 3056, in _try_deallocate_network
[ 1269.038441] env[68906]: ERROR nova.compute.manager [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201]     _deallocate_network_with_retries()
[ 1269.038441] env[68906]: ERROR nova.compute.manager [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201]   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 436, in func
[ 1269.038441] env[68906]: ERROR nova.compute.manager [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201]     return evt.wait()
[ 1269.038441] env[68906]: ERROR nova.compute.manager [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201]   File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait
[ 1269.038441] env[68906]: ERROR nova.compute.manager [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201]     result = hub.switch()
[ 1269.038441] env[68906]: ERROR nova.compute.manager [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201]   File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch
[ 1269.038441] env[68906]: ERROR nova.compute.manager [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201]     return self.greenlet.switch()
[ 1269.038441] env[68906]: ERROR nova.compute.manager [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201]   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 150, in _run_loop
[ 1269.038441] env[68906]: ERROR nova.compute.manager [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201]     result = func(*self.args, **self.kw)
[ 1269.038787] env[68906]: ERROR nova.compute.manager [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201]   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 407, in _func
[ 1269.038787] env[68906]: ERROR nova.compute.manager [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201]     result = f(*args, **kwargs)
[ 1269.038787] env[68906]: ERROR nova.compute.manager [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201]   File "/opt/stack/nova/nova/compute/manager.py", line 3045, in _deallocate_network_with_retries
[ 1269.038787] env[68906]: ERROR nova.compute.manager [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201]     self._deallocate_network(
[ 1269.038787] env[68906]: ERROR nova.compute.manager [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201]   File "/opt/stack/nova/nova/compute/manager.py", line 2265, in _deallocate_network
[ 1269.038787] env[68906]: ERROR nova.compute.manager [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201]     self.network_api.deallocate_for_instance(
[ 1269.038787] env[68906]: ERROR nova.compute.manager [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201]   File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance
[ 1269.038787] env[68906]: ERROR nova.compute.manager [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201]     data = neutron.list_ports(**search_opts)
[ 1269.038787] env[68906]: ERROR nova.compute.manager [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201]   File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1269.038787] env[68906]: ERROR nova.compute.manager [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201]     ret = obj(*args, **kwargs)
[ 1269.038787] env[68906]: ERROR nova.compute.manager [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201]   File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 815, in list_ports
[ 1269.038787] env[68906]: ERROR nova.compute.manager [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201]     return self.list('ports', self.ports_path, retrieve_all,
[ 1269.038787] env[68906]: ERROR nova.compute.manager [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201]   File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1269.039167] env[68906]: ERROR nova.compute.manager [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201]     ret = obj(*args, **kwargs)
[ 1269.039167] env[68906]: ERROR nova.compute.manager [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201]   File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 372, in list
[ 1269.039167] env[68906]: ERROR nova.compute.manager [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201]     for r in self._pagination(collection, path, **params):
[ 1269.039167] env[68906]: ERROR nova.compute.manager [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201]   File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 387, in _pagination
[ 1269.039167] env[68906]: ERROR nova.compute.manager [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201]     res = self.get(path, params=params)
[ 1269.039167] env[68906]: ERROR nova.compute.manager [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201]   File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1269.039167] env[68906]: ERROR nova.compute.manager [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201]     ret = obj(*args, **kwargs)
[ 1269.039167] env[68906]: ERROR nova.compute.manager [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201]   File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 356, in get
[ 1269.039167] env[68906]: ERROR nova.compute.manager [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201]     return self.retry_request("GET", action, body=body,
[ 1269.039167] env[68906]: ERROR nova.compute.manager [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201]   File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1269.039167] env[68906]: ERROR nova.compute.manager [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201]     ret = obj(*args, **kwargs)
[ 1269.039167] env[68906]: ERROR nova.compute.manager [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201]   File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 333, in retry_request
[ 1269.039167] env[68906]: ERROR nova.compute.manager [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201]     return self.do_request(method, action, body=body,
[ 1269.039536] env[68906]: ERROR nova.compute.manager [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201]   File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1269.039536] env[68906]: ERROR nova.compute.manager [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201]     ret = obj(*args, **kwargs)
[ 1269.039536] env[68906]: ERROR nova.compute.manager [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201]   File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 297, in do_request
[ 1269.039536] env[68906]: ERROR nova.compute.manager [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201]     self._handle_fault_response(status_code, replybody, resp)
[ 1269.039536] env[68906]: ERROR nova.compute.manager [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201]   File "/opt/stack/nova/nova/network/neutron.py", line 212, in wrapper
[ 1269.039536] env[68906]: ERROR nova.compute.manager [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201]     raise exception.NeutronAdminCredentialConfigurationInvalid()
[ 1269.039536] env[68906]: ERROR nova.compute.manager [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201] nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception.
[ 1269.039536] env[68906]: ERROR nova.compute.manager [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201]
[ 1269.082918] env[68906]: DEBUG oslo_concurrency.lockutils [None req-8f8aa184-d11f-40ed-b40e-e09ca2f6ed86 tempest-ServerDiagnosticsV248Test-570687926 tempest-ServerDiagnosticsV248Test-570687926-project-member] Lock "eb81e9b1-b573-4d7c-9ede-f8b32a43a201" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.412s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1269.084451] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Lock "eb81e9b1-b573-4d7c-9ede-f8b32a43a201" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 58.133s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1269.084752] env[68906]: INFO nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201] During sync_power_state the instance has a pending task (deleting). Skip.
[ 1269.084993] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Lock "eb81e9b1-b573-4d7c-9ede-f8b32a43a201" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.001s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1269.132482] env[68906]: DEBUG oslo_concurrency.lockutils [None req-e8da611a-e036-4ed5-9519-7f53ab98e63d tempest-MultipleCreateTestJSON-422056473 tempest-MultipleCreateTestJSON-422056473-project-member] Acquiring lock "75a4f8bc-09aa-4c9b-b705-fb84ddcf60ea" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1269.133346] env[68906]: DEBUG oslo_concurrency.lockutils [None req-e8da611a-e036-4ed5-9519-7f53ab98e63d tempest-MultipleCreateTestJSON-422056473 tempest-MultipleCreateTestJSON-422056473-project-member] Lock "75a4f8bc-09aa-4c9b-b705-fb84ddcf60ea" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1269.151272] env[68906]: INFO nova.compute.manager [None req-8f8aa184-d11f-40ed-b40e-e09ca2f6ed86 tempest-ServerDiagnosticsV248Test-570687926 tempest-ServerDiagnosticsV248Test-570687926-project-member] [instance: eb81e9b1-b573-4d7c-9ede-f8b32a43a201] Successfully reverted task state from None on failure for instance.
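Editor's note: nearly every hop in these stacks passes through oslo_utils.excutils.save_and_reraise_exception (the repeating __exit__ / force_reraise / raise self.value frames). The idea: while an exception is in flight, run some cleanup or logging, then re-raise the original exception with its traceback unless the block explicitly opts out. A minimal sketch of the pattern, assuming the context manager is entered from inside an except block; this is not the oslo.utils source:

    import sys

    class save_and_reraise_exception:
        def __init__(self, reraise=True):
            self.reraise = reraise
            self.type_, self.value, self.tb = None, None, None

        def __enter__(self):
            # Capture the exception currently being handled.
            self.type_, self.value, self.tb = sys.exc_info()
            return self

        def force_reraise(self):
            # ~ excutils.py:200 in the frames above
            raise self.value.with_traceback(self.tb)

        def __exit__(self, exc_type, exc_val, exc_tb):
            if exc_type is not None:
                return False  # the cleanup body itself failed; let that win
            if self.reraise and self.value is not None:
                self.force_reraise()  # ~ excutils.py:227 -> force_reraise()
            return False

    try:
        try:
            raise ConnectionError("neutron returned 401")
        except ConnectionError:
            with save_and_reraise_exception():
                print("cleanup runs while the original exception is pending")
    except ConnectionError as exc:
        print(f"re-raised: {exc}")

That is why the original NeutronAdminCredentialConfigurationInvalid survives every decorator layer between terminate_instance and the RPC server, and is what oslo_messaging finally reports next.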
[ 1269.162923] env[68906]: ERROR oslo_messaging.rpc.server [None req-8f8aa184-d11f-40ed-b40e-e09ca2f6ed86 tempest-ServerDiagnosticsV248Test-570687926 tempest-ServerDiagnosticsV248Test-570687926-project-member] Exception during message handling: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception.
[ 1269.162923] env[68906]: ERROR oslo_messaging.rpc.server Traceback (most recent call last):
[ 1269.162923] env[68906]: ERROR oslo_messaging.rpc.server   File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1269.162923] env[68906]: ERROR oslo_messaging.rpc.server     ret = obj(*args, **kwargs)
[ 1269.162923] env[68906]: ERROR oslo_messaging.rpc.server   File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response
[ 1269.162923] env[68906]: ERROR oslo_messaging.rpc.server     exception_handler_v20(status_code, error_body)
[ 1269.162923] env[68906]: ERROR oslo_messaging.rpc.server   File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20
[ 1269.162923] env[68906]: ERROR oslo_messaging.rpc.server     raise client_exc(message=error_message,
[ 1269.162923] env[68906]: ERROR oslo_messaging.rpc.server neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}}
[ 1269.162923] env[68906]: ERROR oslo_messaging.rpc.server Neutron server returns request_ids: ['req-9b0eb5b6-690a-4095-b214-ceef247efeb6']
[ 1269.162923] env[68906]: ERROR oslo_messaging.rpc.server
[ 1269.162923] env[68906]: ERROR oslo_messaging.rpc.server During handling of the above exception, another exception occurred:
[ 1269.162923] env[68906]: ERROR oslo_messaging.rpc.server
[ 1269.162923] env[68906]: ERROR oslo_messaging.rpc.server Traceback (most recent call last):
[ 1269.162923] env[68906]: ERROR oslo_messaging.rpc.server   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_messaging/rpc/server.py", line 165, in _process_incoming
[ 1269.163715] env[68906]: ERROR oslo_messaging.rpc.server     res = self.dispatcher.dispatch(message)
[ 1269.163715] env[68906]: ERROR oslo_messaging.rpc.server   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_messaging/rpc/dispatcher.py", line 309, in dispatch
[ 1269.163715] env[68906]: ERROR oslo_messaging.rpc.server     return self._do_dispatch(endpoint, method, ctxt, args)
[ 1269.163715] env[68906]: ERROR oslo_messaging.rpc.server   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_messaging/rpc/dispatcher.py", line 229, in _do_dispatch
[ 1269.163715] env[68906]: ERROR oslo_messaging.rpc.server     result = func(ctxt, **new_args)
[ 1269.163715] env[68906]: ERROR oslo_messaging.rpc.server   File "/opt/stack/nova/nova/exception_wrapper.py", line 65, in wrapped
[ 1269.163715] env[68906]: ERROR oslo_messaging.rpc.server     with excutils.save_and_reraise_exception():
[ 1269.163715] env[68906]: ERROR oslo_messaging.rpc.server   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 1269.163715] env[68906]: ERROR oslo_messaging.rpc.server     self.force_reraise()
[ 1269.163715] env[68906]: ERROR oslo_messaging.rpc.server   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 1269.163715] env[68906]: ERROR oslo_messaging.rpc.server     raise self.value
[ 1269.163715] env[68906]: ERROR oslo_messaging.rpc.server   File "/opt/stack/nova/nova/exception_wrapper.py", line 63, in wrapped
[ 1269.163715] env[68906]: ERROR oslo_messaging.rpc.server     return f(self, context, *args, **kw)
[ 1269.163715] env[68906]: ERROR oslo_messaging.rpc.server   File "/opt/stack/nova/nova/compute/manager.py", line 166, in decorated_function
[ 1269.163715] env[68906]: ERROR oslo_messaging.rpc.server     with excutils.save_and_reraise_exception():
[ 1269.163715] env[68906]: ERROR oslo_messaging.rpc.server   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 1269.163715] env[68906]: ERROR oslo_messaging.rpc.server     self.force_reraise()
[ 1269.163715] env[68906]: ERROR oslo_messaging.rpc.server   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 1269.164650] env[68906]: ERROR oslo_messaging.rpc.server     raise self.value
[ 1269.164650] env[68906]: ERROR oslo_messaging.rpc.server   File "/opt/stack/nova/nova/compute/manager.py", line 157, in decorated_function
[ 1269.164650] env[68906]: ERROR oslo_messaging.rpc.server     return function(self, context, *args, **kwargs)
[ 1269.164650] env[68906]: ERROR oslo_messaging.rpc.server   File "/opt/stack/nova/nova/compute/utils.py", line 1439, in decorated_function
[ 1269.164650] env[68906]: ERROR oslo_messaging.rpc.server     return function(self, context, *args, **kwargs)
[ 1269.164650] env[68906]: ERROR oslo_messaging.rpc.server   File "/opt/stack/nova/nova/compute/manager.py", line 213, in decorated_function
[ 1269.164650] env[68906]: ERROR oslo_messaging.rpc.server     with excutils.save_and_reraise_exception():
[ 1269.164650] env[68906]: ERROR oslo_messaging.rpc.server   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 1269.164650] env[68906]: ERROR oslo_messaging.rpc.server     self.force_reraise()
[ 1269.164650] env[68906]: ERROR oslo_messaging.rpc.server   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 1269.164650] env[68906]: ERROR oslo_messaging.rpc.server     raise self.value
[ 1269.164650] env[68906]: ERROR oslo_messaging.rpc.server   File "/opt/stack/nova/nova/compute/manager.py", line 203, in decorated_function
[ 1269.164650] env[68906]: ERROR oslo_messaging.rpc.server     return function(self, context, *args, **kwargs)
[ 1269.164650] env[68906]: ERROR oslo_messaging.rpc.server   File "/opt/stack/nova/nova/compute/manager.py", line 3327, in terminate_instance
[ 1269.164650] env[68906]: ERROR oslo_messaging.rpc.server     do_terminate_instance(instance, bdms)
[ 1269.164650] env[68906]: ERROR oslo_messaging.rpc.server   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py", line 414, in inner
[ 1269.164650] env[68906]: ERROR oslo_messaging.rpc.server     return f(*args, **kwargs)
[ 1269.164650] env[68906]: ERROR oslo_messaging.rpc.server   File "/opt/stack/nova/nova/compute/manager.py", line 3322, in do_terminate_instance
[ 1269.165449] env[68906]: ERROR oslo_messaging.rpc.server     with excutils.save_and_reraise_exception():
[ 1269.165449] env[68906]: ERROR oslo_messaging.rpc.server   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 1269.165449] env[68906]: ERROR oslo_messaging.rpc.server     self.force_reraise()
[ 1269.165449] env[68906]: ERROR oslo_messaging.rpc.server   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 1269.165449] env[68906]: ERROR oslo_messaging.rpc.server     raise self.value
[ 1269.165449] env[68906]: ERROR oslo_messaging.rpc.server   File "/opt/stack/nova/nova/compute/manager.py", line 3315, in do_terminate_instance
[ 1269.165449] env[68906]: ERROR oslo_messaging.rpc.server     self._delete_instance(context, instance, bdms)
[ 1269.165449] env[68906]: ERROR oslo_messaging.rpc.server   File "/opt/stack/nova/nova/compute/manager.py", line 3250, in _delete_instance
[ 1269.165449] env[68906]: ERROR oslo_messaging.rpc.server     self._shutdown_instance(context, instance, bdms)
[ 1269.165449] env[68906]: ERROR oslo_messaging.rpc.server   File "/opt/stack/nova/nova/compute/manager.py", line 3144, in _shutdown_instance
[ 1269.165449] env[68906]: ERROR oslo_messaging.rpc.server     self._try_deallocate_network(context, instance, requested_networks)
[ 1269.165449] env[68906]: ERROR oslo_messaging.rpc.server   File "/opt/stack/nova/nova/compute/manager.py", line 3058, in _try_deallocate_network
[ 1269.165449] env[68906]: ERROR oslo_messaging.rpc.server     with excutils.save_and_reraise_exception():
[ 1269.165449] env[68906]: ERROR oslo_messaging.rpc.server   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 1269.165449] env[68906]: ERROR oslo_messaging.rpc.server     self.force_reraise()
[ 1269.165449] env[68906]: ERROR oslo_messaging.rpc.server   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 1269.165449] env[68906]: ERROR oslo_messaging.rpc.server     raise self.value
[ 1269.165449] env[68906]: ERROR oslo_messaging.rpc.server   File "/opt/stack/nova/nova/compute/manager.py", line 3056, in _try_deallocate_network
[ 1269.166249] env[68906]: ERROR oslo_messaging.rpc.server     _deallocate_network_with_retries()
[ 1269.166249] env[68906]: ERROR oslo_messaging.rpc.server   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 436, in func
[ 1269.166249] env[68906]: ERROR oslo_messaging.rpc.server     return evt.wait()
[ 1269.166249] env[68906]: ERROR oslo_messaging.rpc.server   File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait
[ 1269.166249] env[68906]: ERROR oslo_messaging.rpc.server     result = hub.switch()
[ 1269.166249] env[68906]: ERROR oslo_messaging.rpc.server   File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch
[ 1269.166249] env[68906]: ERROR oslo_messaging.rpc.server     return self.greenlet.switch()
[ 1269.166249] env[68906]: ERROR oslo_messaging.rpc.server   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 150, in _run_loop
[ 1269.166249] env[68906]: ERROR oslo_messaging.rpc.server     result = func(*self.args, **self.kw)
[ 1269.166249] env[68906]: ERROR oslo_messaging.rpc.server   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 407, in _func
[ 1269.166249] env[68906]: ERROR oslo_messaging.rpc.server     result = f(*args, **kwargs)
[ 1269.166249] env[68906]: ERROR oslo_messaging.rpc.server   File "/opt/stack/nova/nova/compute/manager.py", line 3045, in _deallocate_network_with_retries
[ 1269.166249] env[68906]: ERROR oslo_messaging.rpc.server     self._deallocate_network(
[ 1269.166249] env[68906]: ERROR oslo_messaging.rpc.server   File "/opt/stack/nova/nova/compute/manager.py", line 2265, in _deallocate_network
[ 1269.166249] env[68906]: ERROR oslo_messaging.rpc.server     self.network_api.deallocate_for_instance(
[ 1269.166249] env[68906]: ERROR oslo_messaging.rpc.server   File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance
[ 1269.166249] env[68906]: ERROR oslo_messaging.rpc.server     data = neutron.list_ports(**search_opts)
[ 1269.166249] env[68906]: ERROR oslo_messaging.rpc.server   File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1269.167274] env[68906]: ERROR oslo_messaging.rpc.server     ret = obj(*args, **kwargs)
[ 1269.167274] env[68906]: ERROR oslo_messaging.rpc.server   File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 815, in list_ports
[ 1269.167274] env[68906]: ERROR oslo_messaging.rpc.server     return self.list('ports', self.ports_path, retrieve_all,
[ 1269.167274] env[68906]: ERROR oslo_messaging.rpc.server   File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1269.167274] env[68906]: ERROR oslo_messaging.rpc.server     ret = obj(*args, **kwargs)
[ 1269.167274] env[68906]: ERROR oslo_messaging.rpc.server   File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 372, in list
[ 1269.167274] env[68906]: ERROR oslo_messaging.rpc.server     for r in self._pagination(collection, path, **params):
[ 1269.167274] env[68906]: ERROR oslo_messaging.rpc.server   File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 387, in _pagination
[ 1269.167274] env[68906]: ERROR oslo_messaging.rpc.server     res = self.get(path, params=params)
[ 1269.167274] env[68906]: ERROR oslo_messaging.rpc.server   File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1269.167274] env[68906]: ERROR oslo_messaging.rpc.server     ret = obj(*args, **kwargs)
[ 1269.167274] env[68906]: ERROR oslo_messaging.rpc.server   File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 356, in get
[ 1269.167274] env[68906]: ERROR oslo_messaging.rpc.server     return self.retry_request("GET", action, body=body,
[ 1269.167274] env[68906]: ERROR oslo_messaging.rpc.server   File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1269.167274] env[68906]: ERROR oslo_messaging.rpc.server     ret = obj(*args, **kwargs)
[ 1269.167274] env[68906]: ERROR oslo_messaging.rpc.server   File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 333, in retry_request
[ 1269.167274] env[68906]: ERROR oslo_messaging.rpc.server     return self.do_request(method, action, body=body,
[ 1269.167274] env[68906]: ERROR oslo_messaging.rpc.server   File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1269.167953] env[68906]: ERROR oslo_messaging.rpc.server     ret = obj(*args, **kwargs)
[ 1269.167953] env[68906]: ERROR oslo_messaging.rpc.server   File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 297, in do_request
[ 1269.167953] env[68906]: ERROR oslo_messaging.rpc.server     self._handle_fault_response(status_code, replybody, resp)
[ 1269.167953] env[68906]: ERROR oslo_messaging.rpc.server   File "/opt/stack/nova/nova/network/neutron.py", line 212, in wrapper
[ 1269.167953] env[68906]: ERROR oslo_messaging.rpc.server     raise exception.NeutronAdminCredentialConfigurationInvalid()
[ 1269.167953] env[68906]: ERROR oslo_messaging.rpc.server nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception.
[ 1269.167953] env[68906]: ERROR oslo_messaging.rpc.server
[ 1269.167953] env[68906]: DEBUG oslo_concurrency.lockutils [None req-e8da611a-e036-4ed5-9519-7f53ab98e63d tempest-MultipleCreateTestJSON-422056473 tempest-MultipleCreateTestJSON-422056473-project-member] Acquiring lock "ff99f1e3-9a4a-487e-afcb-6d8439a0491d" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1269.167953] env[68906]: DEBUG oslo_concurrency.lockutils [None req-e8da611a-e036-4ed5-9519-7f53ab98e63d tempest-MultipleCreateTestJSON-422056473 tempest-MultipleCreateTestJSON-422056473-project-member] Lock "ff99f1e3-9a4a-487e-afcb-6d8439a0491d" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1269.269928] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8813bb79-ba0d-47ca-9dfb-dd9b45be13cd {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1269.278614] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8e937e69-bed8-48aa-b66d-9934b8beaa8c {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1269.315758] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2094a01a-9834-42b8-9039-44d7e78711bd {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1269.323285] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1c6aa554-81d8-459d-adcd-ef6f1559bbcd {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1269.336584] env[68906]: DEBUG nova.compute.provider_tree [None req-8cc1b37d-3cc0-4e00-8b4e-1ea1e183df21 tempest-ServerAddressesTestJSON-218061599 tempest-ServerAddressesTestJSON-218061599-project-member] Inventory has not changed in ProviderTree for provider: 1119f6db-bfd7-4ef3-bdff-5c6974dc249b {{(pid=68906) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 1269.349724] env[68906]: DEBUG nova.scheduler.client.report [None req-8cc1b37d-3cc0-4e00-8b4e-1ea1e183df21 tempest-ServerAddressesTestJSON-218061599 tempest-ServerAddressesTestJSON-218061599-project-member] Inventory has not changed for provider 1119f6db-bfd7-4ef3-bdff-5c6974dc249b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 93, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68906) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 1269.362546] env[68906]: DEBUG oslo_concurrency.lockutils [None req-8cc1b37d-3cc0-4e00-8b4e-1ea1e183df21 tempest-ServerAddressesTestJSON-218061599 tempest-ServerAddressesTestJSON-218061599-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held
0.499s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1269.363063] env[68906]: DEBUG nova.compute.manager [None req-8cc1b37d-3cc0-4e00-8b4e-1ea1e183df21 tempest-ServerAddressesTestJSON-218061599 tempest-ServerAddressesTestJSON-218061599-project-member] [instance: 6c28f571-e74a-48f9-9cc7-a9e4ddea8b09] Start building networks asynchronously for instance. {{(pid=68906) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1269.365232] env[68906]: DEBUG oslo_concurrency.lockutils [None req-bc278641-e9dc-495f-b85d-d30d2e1da09f tempest-ServerShowV254Test-1577433897 tempest-ServerShowV254Test-1577433897-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.495s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1269.366581] env[68906]: INFO nova.compute.claims [None req-bc278641-e9dc-495f-b85d-d30d2e1da09f tempest-ServerShowV254Test-1577433897 tempest-ServerShowV254Test-1577433897-project-member] [instance: e0e595e3-e47e-4cf1-8977-f004eca942d1] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1269.400427] env[68906]: DEBUG nova.compute.utils [None req-8cc1b37d-3cc0-4e00-8b4e-1ea1e183df21 tempest-ServerAddressesTestJSON-218061599 tempest-ServerAddressesTestJSON-218061599-project-member] Using /dev/sd instead of None {{(pid=68906) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1269.402100] env[68906]: DEBUG nova.compute.manager [None req-8cc1b37d-3cc0-4e00-8b4e-1ea1e183df21 tempest-ServerAddressesTestJSON-218061599 tempest-ServerAddressesTestJSON-218061599-project-member] [instance: 6c28f571-e74a-48f9-9cc7-a9e4ddea8b09] Allocating IP information in the background. {{(pid=68906) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1269.402271] env[68906]: DEBUG nova.network.neutron [None req-8cc1b37d-3cc0-4e00-8b4e-1ea1e183df21 tempest-ServerAddressesTestJSON-218061599 tempest-ServerAddressesTestJSON-218061599-project-member] [instance: 6c28f571-e74a-48f9-9cc7-a9e4ddea8b09] allocate_for_instance() {{(pid=68906) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1269.408783] env[68906]: DEBUG nova.compute.manager [None req-8cc1b37d-3cc0-4e00-8b4e-1ea1e183df21 tempest-ServerAddressesTestJSON-218061599 tempest-ServerAddressesTestJSON-218061599-project-member] [instance: 6c28f571-e74a-48f9-9cc7-a9e4ddea8b09] Start building block device mappings for instance. 
{{(pid=68906) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1269.467628] env[68906]: DEBUG nova.policy [None req-8cc1b37d-3cc0-4e00-8b4e-1ea1e183df21 tempest-ServerAddressesTestJSON-218061599 tempest-ServerAddressesTestJSON-218061599-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '5933dfaa93ea4f72a1a285f5ebd66ad9', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'f0d5a1687ee84efea1690fc284b60dd4', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68906) authorize /opt/stack/nova/nova/policy.py:203}} [ 1269.472375] env[68906]: DEBUG nova.compute.manager [None req-8cc1b37d-3cc0-4e00-8b4e-1ea1e183df21 tempest-ServerAddressesTestJSON-218061599 tempest-ServerAddressesTestJSON-218061599-project-member] [instance: 6c28f571-e74a-48f9-9cc7-a9e4ddea8b09] Start spawning the instance on the hypervisor. {{(pid=68906) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1269.497323] env[68906]: DEBUG nova.virt.hardware [None req-8cc1b37d-3cc0-4e00-8b4e-1ea1e183df21 tempest-ServerAddressesTestJSON-218061599 tempest-ServerAddressesTestJSON-218061599-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-17T13:00:39Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-17T13:00:23Z,direct_url=,disk_format='vmdk',id=b1400c31-d33b-4e13-944f-4c645e62493e,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='1ae7bf3a375d41c6af5e7536af51ffd1',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-17T13:00:24Z,virtual_size=,visibility=), allow threads: False {{(pid=68906) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1269.497639] env[68906]: DEBUG nova.virt.hardware [None req-8cc1b37d-3cc0-4e00-8b4e-1ea1e183df21 tempest-ServerAddressesTestJSON-218061599 tempest-ServerAddressesTestJSON-218061599-project-member] Flavor limits 0:0:0 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1269.497840] env[68906]: DEBUG nova.virt.hardware [None req-8cc1b37d-3cc0-4e00-8b4e-1ea1e183df21 tempest-ServerAddressesTestJSON-218061599 tempest-ServerAddressesTestJSON-218061599-project-member] Image limits 0:0:0 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1269.498416] env[68906]: DEBUG nova.virt.hardware [None req-8cc1b37d-3cc0-4e00-8b4e-1ea1e183df21 tempest-ServerAddressesTestJSON-218061599 tempest-ServerAddressesTestJSON-218061599-project-member] Flavor pref 0:0:0 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1269.498416] env[68906]: DEBUG nova.virt.hardware [None req-8cc1b37d-3cc0-4e00-8b4e-1ea1e183df21 tempest-ServerAddressesTestJSON-218061599 tempest-ServerAddressesTestJSON-218061599-project-member] Image pref 0:0:0 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1269.498416] env[68906]: DEBUG 
nova.virt.hardware [None req-8cc1b37d-3cc0-4e00-8b4e-1ea1e183df21 tempest-ServerAddressesTestJSON-218061599 tempest-ServerAddressesTestJSON-218061599-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1269.498700] env[68906]: DEBUG nova.virt.hardware [None req-8cc1b37d-3cc0-4e00-8b4e-1ea1e183df21 tempest-ServerAddressesTestJSON-218061599 tempest-ServerAddressesTestJSON-218061599-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68906) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1269.498906] env[68906]: DEBUG nova.virt.hardware [None req-8cc1b37d-3cc0-4e00-8b4e-1ea1e183df21 tempest-ServerAddressesTestJSON-218061599 tempest-ServerAddressesTestJSON-218061599-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68906) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1269.499171] env[68906]: DEBUG nova.virt.hardware [None req-8cc1b37d-3cc0-4e00-8b4e-1ea1e183df21 tempest-ServerAddressesTestJSON-218061599 tempest-ServerAddressesTestJSON-218061599-project-member] Got 1 possible topologies {{(pid=68906) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1269.499329] env[68906]: DEBUG nova.virt.hardware [None req-8cc1b37d-3cc0-4e00-8b4e-1ea1e183df21 tempest-ServerAddressesTestJSON-218061599 tempest-ServerAddressesTestJSON-218061599-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68906) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1269.499571] env[68906]: DEBUG nova.virt.hardware [None req-8cc1b37d-3cc0-4e00-8b4e-1ea1e183df21 tempest-ServerAddressesTestJSON-218061599 tempest-ServerAddressesTestJSON-218061599-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68906) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1269.501085] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e22226b8-54b4-4824-b26f-3980388bdc1c {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1269.508862] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-32747100-8da0-49e8-aa8b-c0389418cd3f {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1269.730735] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7fbe316c-0bda-446a-b1c5-d5d2f7d32f11 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1269.741851] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0b569133-ad08-46cb-8253-e5971dce6ed6 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1269.779484] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-15a57edf-89c6-4c4c-b8ca-69010449b812 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1269.786721] env[68906]: DEBUG oslo_vmware.service [-] Invoking 
PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-036ffebd-c83e-4aae-a761-40122671bbd6 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1269.791232] env[68906]: DEBUG nova.network.neutron [None req-8cc1b37d-3cc0-4e00-8b4e-1ea1e183df21 tempest-ServerAddressesTestJSON-218061599 tempest-ServerAddressesTestJSON-218061599-project-member] [instance: 6c28f571-e74a-48f9-9cc7-a9e4ddea8b09] Successfully created port: 783bfa88-946f-4031-9311-1512b3c2be53 {{(pid=68906) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1269.803568] env[68906]: DEBUG nova.compute.provider_tree [None req-bc278641-e9dc-495f-b85d-d30d2e1da09f tempest-ServerShowV254Test-1577433897 tempest-ServerShowV254Test-1577433897-project-member] Inventory has not changed in ProviderTree for provider: 1119f6db-bfd7-4ef3-bdff-5c6974dc249b {{(pid=68906) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1269.815849] env[68906]: DEBUG nova.scheduler.client.report [None req-bc278641-e9dc-495f-b85d-d30d2e1da09f tempest-ServerShowV254Test-1577433897 tempest-ServerShowV254Test-1577433897-project-member] Inventory has not changed for provider 1119f6db-bfd7-4ef3-bdff-5c6974dc249b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 93, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68906) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1269.831040] env[68906]: DEBUG oslo_concurrency.lockutils [None req-bc278641-e9dc-495f-b85d-d30d2e1da09f tempest-ServerShowV254Test-1577433897 tempest-ServerShowV254Test-1577433897-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.466s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1269.832025] env[68906]: DEBUG nova.compute.manager [None req-bc278641-e9dc-495f-b85d-d30d2e1da09f tempest-ServerShowV254Test-1577433897 tempest-ServerShowV254Test-1577433897-project-member] [instance: e0e595e3-e47e-4cf1-8977-f004eca942d1] Start building networks asynchronously for instance. {{(pid=68906) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1269.870121] env[68906]: DEBUG nova.compute.utils [None req-bc278641-e9dc-495f-b85d-d30d2e1da09f tempest-ServerShowV254Test-1577433897 tempest-ServerShowV254Test-1577433897-project-member] Using /dev/sd instead of None {{(pid=68906) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1269.871703] env[68906]: DEBUG nova.compute.manager [None req-bc278641-e9dc-495f-b85d-d30d2e1da09f tempest-ServerShowV254Test-1577433897 tempest-ServerShowV254Test-1577433897-project-member] [instance: e0e595e3-e47e-4cf1-8977-f004eca942d1] Not allocating networking since 'none' was specified. {{(pid=68906) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 1269.888019] env[68906]: DEBUG nova.compute.manager [None req-bc278641-e9dc-495f-b85d-d30d2e1da09f tempest-ServerShowV254Test-1577433897 tempest-ServerShowV254Test-1577433897-project-member] [instance: e0e595e3-e47e-4cf1-8977-f004eca942d1] Start building block device mappings for instance. 
{{(pid=68906) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1269.971676] env[68906]: DEBUG nova.compute.manager [None req-bc278641-e9dc-495f-b85d-d30d2e1da09f tempest-ServerShowV254Test-1577433897 tempest-ServerShowV254Test-1577433897-project-member] [instance: e0e595e3-e47e-4cf1-8977-f004eca942d1] Start spawning the instance on the hypervisor. {{(pid=68906) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1270.002718] env[68906]: DEBUG nova.virt.hardware [None req-bc278641-e9dc-495f-b85d-d30d2e1da09f tempest-ServerShowV254Test-1577433897 tempest-ServerShowV254Test-1577433897-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-17T13:00:39Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-17T13:00:23Z,direct_url=,disk_format='vmdk',id=b1400c31-d33b-4e13-944f-4c645e62493e,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='1ae7bf3a375d41c6af5e7536af51ffd1',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-17T13:00:24Z,virtual_size=,visibility=), allow threads: False {{(pid=68906) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1270.002903] env[68906]: DEBUG nova.virt.hardware [None req-bc278641-e9dc-495f-b85d-d30d2e1da09f tempest-ServerShowV254Test-1577433897 tempest-ServerShowV254Test-1577433897-project-member] Flavor limits 0:0:0 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1270.003074] env[68906]: DEBUG nova.virt.hardware [None req-bc278641-e9dc-495f-b85d-d30d2e1da09f tempest-ServerShowV254Test-1577433897 tempest-ServerShowV254Test-1577433897-project-member] Image limits 0:0:0 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1270.003288] env[68906]: DEBUG nova.virt.hardware [None req-bc278641-e9dc-495f-b85d-d30d2e1da09f tempest-ServerShowV254Test-1577433897 tempest-ServerShowV254Test-1577433897-project-member] Flavor pref 0:0:0 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1270.005751] env[68906]: DEBUG nova.virt.hardware [None req-bc278641-e9dc-495f-b85d-d30d2e1da09f tempest-ServerShowV254Test-1577433897 tempest-ServerShowV254Test-1577433897-project-member] Image pref 0:0:0 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1270.005751] env[68906]: DEBUG nova.virt.hardware [None req-bc278641-e9dc-495f-b85d-d30d2e1da09f tempest-ServerShowV254Test-1577433897 tempest-ServerShowV254Test-1577433897-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1270.005751] env[68906]: DEBUG nova.virt.hardware [None req-bc278641-e9dc-495f-b85d-d30d2e1da09f tempest-ServerShowV254Test-1577433897 tempest-ServerShowV254Test-1577433897-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68906) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1270.005751] env[68906]: DEBUG nova.virt.hardware [None 
req-bc278641-e9dc-495f-b85d-d30d2e1da09f tempest-ServerShowV254Test-1577433897 tempest-ServerShowV254Test-1577433897-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68906) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1270.005751] env[68906]: DEBUG nova.virt.hardware [None req-bc278641-e9dc-495f-b85d-d30d2e1da09f tempest-ServerShowV254Test-1577433897 tempest-ServerShowV254Test-1577433897-project-member] Got 1 possible topologies {{(pid=68906) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1270.006091] env[68906]: DEBUG nova.virt.hardware [None req-bc278641-e9dc-495f-b85d-d30d2e1da09f tempest-ServerShowV254Test-1577433897 tempest-ServerShowV254Test-1577433897-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68906) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1270.006091] env[68906]: DEBUG nova.virt.hardware [None req-bc278641-e9dc-495f-b85d-d30d2e1da09f tempest-ServerShowV254Test-1577433897 tempest-ServerShowV254Test-1577433897-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68906) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1270.006091] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6adf97f3-f558-469b-a53c-6e16f315953c {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1270.014209] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-718d587f-4623-47df-a865-4b26778cad58 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1270.028355] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-bc278641-e9dc-495f-b85d-d30d2e1da09f tempest-ServerShowV254Test-1577433897 tempest-ServerShowV254Test-1577433897-project-member] [instance: e0e595e3-e47e-4cf1-8977-f004eca942d1] Instance VIF info [] {{(pid=68906) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1270.037913] env[68906]: DEBUG nova.virt.vmwareapi.vm_util [None req-bc278641-e9dc-495f-b85d-d30d2e1da09f tempest-ServerShowV254Test-1577433897 tempest-ServerShowV254Test-1577433897-project-member] Creating folder: Project (dce599ead26f469698616c0e9aed4419). Parent ref: group-v694750. {{(pid=68906) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1270.038266] env[68906]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-14b9420f-cbd6-45da-8c05-0c2f719d0f2c {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1270.050512] env[68906]: INFO nova.virt.vmwareapi.vm_util [None req-bc278641-e9dc-495f-b85d-d30d2e1da09f tempest-ServerShowV254Test-1577433897 tempest-ServerShowV254Test-1577433897-project-member] Created folder: Project (dce599ead26f469698616c0e9aed4419) in parent group-v694750. [ 1270.050728] env[68906]: DEBUG nova.virt.vmwareapi.vm_util [None req-bc278641-e9dc-495f-b85d-d30d2e1da09f tempest-ServerShowV254Test-1577433897 tempest-ServerShowV254Test-1577433897-project-member] Creating folder: Instances. Parent ref: group-v694819. 
{{(pid=68906) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}}
[ 1270.050959] env[68906]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-6bac82de-a714-4968-84df-93dc51ba6e83 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1270.059716] env[68906]: INFO nova.virt.vmwareapi.vm_util [None req-bc278641-e9dc-495f-b85d-d30d2e1da09f tempest-ServerShowV254Test-1577433897 tempest-ServerShowV254Test-1577433897-project-member] Created folder: Instances in parent group-v694819.
[ 1270.060673] env[68906]: DEBUG oslo.service.loopingcall [None req-bc278641-e9dc-495f-b85d-d30d2e1da09f tempest-ServerShowV254Test-1577433897 tempest-ServerShowV254Test-1577433897-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=68906) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}}
[ 1270.060673] env[68906]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: e0e595e3-e47e-4cf1-8977-f004eca942d1] Creating VM on the ESX host {{(pid=68906) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}}
[ 1270.060673] env[68906]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-450a27d9-c4c8-44a1-95bd-92a4b5f5d352 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1270.076413] env[68906]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){
[ 1270.076413] env[68906]:     value = "task-3475376"
[ 1270.076413] env[68906]:     _type = "Task"
[ 1270.076413] env[68906]: } to complete. {{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 1270.083656] env[68906]: DEBUG oslo_vmware.api [-] Task: {'id': task-3475376, 'name': CreateVM_Task} progress is 0%. {{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 1270.389184] env[68906]: DEBUG nova.compute.manager [req-3baa6173-1589-473e-af10-dd8cbbd01f70 req-f74aec73-33c0-4115-8702-f311826b5fcc service nova] [instance: 6c28f571-e74a-48f9-9cc7-a9e4ddea8b09] Received event network-vif-plugged-783bfa88-946f-4031-9311-1512b3c2be53 {{(pid=68906) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}}
[ 1270.389433] env[68906]: DEBUG oslo_concurrency.lockutils [req-3baa6173-1589-473e-af10-dd8cbbd01f70 req-f74aec73-33c0-4115-8702-f311826b5fcc service nova] Acquiring lock "6c28f571-e74a-48f9-9cc7-a9e4ddea8b09-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1270.389683] env[68906]: DEBUG oslo_concurrency.lockutils [req-3baa6173-1589-473e-af10-dd8cbbd01f70 req-f74aec73-33c0-4115-8702-f311826b5fcc service nova] Lock "6c28f571-e74a-48f9-9cc7-a9e4ddea8b09-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1270.389857] env[68906]: DEBUG oslo_concurrency.lockutils [req-3baa6173-1589-473e-af10-dd8cbbd01f70 req-f74aec73-33c0-4115-8702-f311826b5fcc service nova] Lock "6c28f571-e74a-48f9-9cc7-a9e4ddea8b09-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1270.390120] env[68906]: DEBUG nova.compute.manager [req-3baa6173-1589-473e-af10-dd8cbbd01f70 req-f74aec73-33c0-4115-8702-f311826b5fcc service nova] [instance: 6c28f571-e74a-48f9-9cc7-a9e4ddea8b09] No waiting events found dispatching network-vif-plugged-783bfa88-946f-4031-9311-1512b3c2be53 {{(pid=68906) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}}
[ 1270.390331] env[68906]: WARNING nova.compute.manager [req-3baa6173-1589-473e-af10-dd8cbbd01f70 req-f74aec73-33c0-4115-8702-f311826b5fcc service nova] [instance: 6c28f571-e74a-48f9-9cc7-a9e4ddea8b09] Received unexpected event network-vif-plugged-783bfa88-946f-4031-9311-1512b3c2be53 for instance with vm_state building and task_state spawning.
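Editor's note: the network-vif-plugged sequence above is Nova's external-event handshake racing the build. Neutron delivers the event (manager.py:11101) before anyone has registered a waiter, so pop_instance_event finds nothing under the per-instance "<uuid>-events" lock and the event is logged as unexpected while the instance is still building/spawning (harmless here). A toy version of the pop-or-warn registry, using threading primitives rather than Nova's eventlet-based objects, just to show the assumed shape of the mechanism:

    import threading
    from collections import defaultdict

    class InstanceEvents:
        def __init__(self):
            self._lock = threading.Lock()      # plays the role of the "<uuid>-events" lock
            self._waiters = defaultdict(dict)  # instance uuid -> {event name: Event}

        def prepare(self, instance_uuid, event_name):
            # A spawner registers interest *before* triggering Neutron.
            with self._lock:
                ev = threading.Event()
                self._waiters[instance_uuid][event_name] = ev
                return ev

        def pop_instance_event(self, instance_uuid, event_name):
            with self._lock:  # ~ pop_instance_event.<locals>._pop_event
                return self._waiters[instance_uuid].pop(event_name, None)

    registry = InstanceEvents()

    def external_instance_event(instance_uuid, event_name):
        ev = registry.pop_instance_event(instance_uuid, event_name)
        if ev is None:
            # ~ "No waiting events found ... Received unexpected event"
            print(f"unexpected {event_name} for {instance_uuid}")
        else:
            ev.set()

    uuid, name = "6c28f571-e74a-48f9-9cc7-a9e4ddea8b09", "network-vif-plugged-783bfa88"
    external_instance_event(uuid, name)   # arrives early: warned and dropped
    waiter = registry.prepare(uuid, name)
    external_instance_event(uuid, name)   # now someone is waiting
    print("signalled:", waiter.is_set())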
[ 1270.449784] env[68906]: DEBUG nova.network.neutron [None req-8cc1b37d-3cc0-4e00-8b4e-1ea1e183df21 tempest-ServerAddressesTestJSON-218061599 tempest-ServerAddressesTestJSON-218061599-project-member] [instance: 6c28f571-e74a-48f9-9cc7-a9e4ddea8b09] Successfully updated port: 783bfa88-946f-4031-9311-1512b3c2be53 {{(pid=68906) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1270.461946] env[68906]: DEBUG oslo_concurrency.lockutils [None req-8cc1b37d-3cc0-4e00-8b4e-1ea1e183df21 tempest-ServerAddressesTestJSON-218061599 tempest-ServerAddressesTestJSON-218061599-project-member] Acquiring lock "refresh_cache-6c28f571-e74a-48f9-9cc7-a9e4ddea8b09" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1270.462123] env[68906]: DEBUG oslo_concurrency.lockutils [None req-8cc1b37d-3cc0-4e00-8b4e-1ea1e183df21 tempest-ServerAddressesTestJSON-218061599 tempest-ServerAddressesTestJSON-218061599-project-member] Acquired lock "refresh_cache-6c28f571-e74a-48f9-9cc7-a9e4ddea8b09" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1270.462277] env[68906]: DEBUG nova.network.neutron [None req-8cc1b37d-3cc0-4e00-8b4e-1ea1e183df21 tempest-ServerAddressesTestJSON-218061599 tempest-ServerAddressesTestJSON-218061599-project-member] [instance: 6c28f571-e74a-48f9-9cc7-a9e4ddea8b09] Building network info cache for instance {{(pid=68906) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1270.501455] env[68906]: DEBUG nova.network.neutron [None req-8cc1b37d-3cc0-4e00-8b4e-1ea1e183df21 tempest-ServerAddressesTestJSON-218061599 tempest-ServerAddressesTestJSON-218061599-project-member] [instance: 6c28f571-e74a-48f9-9cc7-a9e4ddea8b09] Instance cache missing network info. {{(pid=68906) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1270.586597] env[68906]: DEBUG oslo_vmware.api [-] Task: {'id': task-3475376, 'name': CreateVM_Task, 'duration_secs': 0.253213} completed successfully. 
{{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1270.586793] env[68906]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: e0e595e3-e47e-4cf1-8977-f004eca942d1] Created VM on the ESX host {{(pid=68906) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1270.587221] env[68906]: DEBUG oslo_concurrency.lockutils [None req-bc278641-e9dc-495f-b85d-d30d2e1da09f tempest-ServerShowV254Test-1577433897 tempest-ServerShowV254Test-1577433897-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1270.587382] env[68906]: DEBUG oslo_concurrency.lockutils [None req-bc278641-e9dc-495f-b85d-d30d2e1da09f tempest-ServerShowV254Test-1577433897 tempest-ServerShowV254Test-1577433897-project-member] Acquired lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1270.587700] env[68906]: DEBUG oslo_concurrency.lockutils [None req-bc278641-e9dc-495f-b85d-d30d2e1da09f tempest-ServerShowV254Test-1577433897 tempest-ServerShowV254Test-1577433897-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1270.587937] env[68906]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-cb4cb41e-4644-472d-ad8c-f5d29c6d27d7 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1270.591965] env[68906]: DEBUG oslo_vmware.api [None req-bc278641-e9dc-495f-b85d-d30d2e1da09f tempest-ServerShowV254Test-1577433897 tempest-ServerShowV254Test-1577433897-project-member] Waiting for the task: (returnval){ [ 1270.591965] env[68906]: value = "session[52a3cb0d-1212-490c-2669-91043b4da4d8]521fc8f9-1016-5e5b-de13-5771e48d0d42" [ 1270.591965] env[68906]: _type = "Task" [ 1270.591965] env[68906]: } to complete. {{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1270.599086] env[68906]: DEBUG oslo_vmware.api [None req-bc278641-e9dc-495f-b85d-d30d2e1da09f tempest-ServerShowV254Test-1577433897 tempest-ServerShowV254Test-1577433897-project-member] Task: {'id': session[52a3cb0d-1212-490c-2669-91043b4da4d8]521fc8f9-1016-5e5b-de13-5771e48d0d42, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1270.807636] env[68906]: DEBUG nova.network.neutron [None req-8cc1b37d-3cc0-4e00-8b4e-1ea1e183df21 tempest-ServerAddressesTestJSON-218061599 tempest-ServerAddressesTestJSON-218061599-project-member] [instance: 6c28f571-e74a-48f9-9cc7-a9e4ddea8b09] Updating instance_info_cache with network_info: [{"id": "783bfa88-946f-4031-9311-1512b3c2be53", "address": "fa:16:3e:19:98:29", "network": {"id": "a314631a-365d-43a7-85c2-97777cb4546b", "bridge": "br-int", "label": "tempest-ServerAddressesTestJSON-1570205564-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "f0d5a1687ee84efea1690fc284b60dd4", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "41f66e20-fd86-4158-bbdc-7a150e85e844", "external-id": "nsx-vlan-transportzone-182", "segmentation_id": 182, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap783bfa88-94", "ovs_interfaceid": "783bfa88-946f-4031-9311-1512b3c2be53", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68906) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1270.822558] env[68906]: DEBUG oslo_concurrency.lockutils [None req-8cc1b37d-3cc0-4e00-8b4e-1ea1e183df21 tempest-ServerAddressesTestJSON-218061599 tempest-ServerAddressesTestJSON-218061599-project-member] Releasing lock "refresh_cache-6c28f571-e74a-48f9-9cc7-a9e4ddea8b09" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1270.822874] env[68906]: DEBUG nova.compute.manager [None req-8cc1b37d-3cc0-4e00-8b4e-1ea1e183df21 tempest-ServerAddressesTestJSON-218061599 tempest-ServerAddressesTestJSON-218061599-project-member] [instance: 6c28f571-e74a-48f9-9cc7-a9e4ddea8b09] Instance network_info: |[{"id": "783bfa88-946f-4031-9311-1512b3c2be53", "address": "fa:16:3e:19:98:29", "network": {"id": "a314631a-365d-43a7-85c2-97777cb4546b", "bridge": "br-int", "label": "tempest-ServerAddressesTestJSON-1570205564-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "f0d5a1687ee84efea1690fc284b60dd4", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "41f66e20-fd86-4158-bbdc-7a150e85e844", "external-id": "nsx-vlan-transportzone-182", "segmentation_id": 182, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap783bfa88-94", "ovs_interfaceid": "783bfa88-946f-4031-9311-1512b3c2be53", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=68906) _allocate_network_async 
/opt/stack/nova/nova/compute/manager.py:1971}} [ 1270.823277] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-8cc1b37d-3cc0-4e00-8b4e-1ea1e183df21 tempest-ServerAddressesTestJSON-218061599 tempest-ServerAddressesTestJSON-218061599-project-member] [instance: 6c28f571-e74a-48f9-9cc7-a9e4ddea8b09] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:19:98:29', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '41f66e20-fd86-4158-bbdc-7a150e85e844', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '783bfa88-946f-4031-9311-1512b3c2be53', 'vif_model': 'vmxnet3'}] {{(pid=68906) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1270.830683] env[68906]: DEBUG nova.virt.vmwareapi.vm_util [None req-8cc1b37d-3cc0-4e00-8b4e-1ea1e183df21 tempest-ServerAddressesTestJSON-218061599 tempest-ServerAddressesTestJSON-218061599-project-member] Creating folder: Project (f0d5a1687ee84efea1690fc284b60dd4). Parent ref: group-v694750. {{(pid=68906) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1270.831190] env[68906]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-4f4e5dbb-915a-4ba6-90a9-4bf22ae60c35 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1270.840767] env[68906]: INFO nova.virt.vmwareapi.vm_util [None req-8cc1b37d-3cc0-4e00-8b4e-1ea1e183df21 tempest-ServerAddressesTestJSON-218061599 tempest-ServerAddressesTestJSON-218061599-project-member] Created folder: Project (f0d5a1687ee84efea1690fc284b60dd4) in parent group-v694750. [ 1270.840946] env[68906]: DEBUG nova.virt.vmwareapi.vm_util [None req-8cc1b37d-3cc0-4e00-8b4e-1ea1e183df21 tempest-ServerAddressesTestJSON-218061599 tempest-ServerAddressesTestJSON-218061599-project-member] Creating folder: Instances. Parent ref: group-v694822. {{(pid=68906) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1270.841167] env[68906]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-bb38ce30-9711-4a46-9fad-e801ad242135 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1270.849657] env[68906]: INFO nova.virt.vmwareapi.vm_util [None req-8cc1b37d-3cc0-4e00-8b4e-1ea1e183df21 tempest-ServerAddressesTestJSON-218061599 tempest-ServerAddressesTestJSON-218061599-project-member] Created folder: Instances in parent group-v694822. [ 1270.849876] env[68906]: DEBUG oslo.service.loopingcall [None req-8cc1b37d-3cc0-4e00-8b4e-1ea1e183df21 tempest-ServerAddressesTestJSON-218061599 tempest-ServerAddressesTestJSON-218061599-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=68906) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1270.850066] env[68906]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 6c28f571-e74a-48f9-9cc7-a9e4ddea8b09] Creating VM on the ESX host {{(pid=68906) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1270.850253] env[68906]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-074e2a24-8fc2-4aa5-8c12-27206960501c {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1270.868390] env[68906]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1270.868390] env[68906]: value = "task-3475379" [ 1270.868390] env[68906]: _type = "Task" [ 1270.868390] env[68906]: } to complete. {{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1270.875731] env[68906]: DEBUG oslo_vmware.api [-] Task: {'id': task-3475379, 'name': CreateVM_Task} progress is 0%. {{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1271.102463] env[68906]: DEBUG oslo_concurrency.lockutils [None req-bc278641-e9dc-495f-b85d-d30d2e1da09f tempest-ServerShowV254Test-1577433897 tempest-ServerShowV254Test-1577433897-project-member] Releasing lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1271.102761] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-bc278641-e9dc-495f-b85d-d30d2e1da09f tempest-ServerShowV254Test-1577433897 tempest-ServerShowV254Test-1577433897-project-member] [instance: e0e595e3-e47e-4cf1-8977-f004eca942d1] Processing image b1400c31-d33b-4e13-944f-4c645e62493e {{(pid=68906) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1271.103016] env[68906]: DEBUG oslo_concurrency.lockutils [None req-bc278641-e9dc-495f-b85d-d30d2e1da09f tempest-ServerShowV254Test-1577433897 tempest-ServerShowV254Test-1577433897-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e/b1400c31-d33b-4e13-944f-4c645e62493e.vmdk" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1271.378114] env[68906]: DEBUG oslo_vmware.api [-] Task: {'id': task-3475379, 'name': CreateVM_Task, 'duration_secs': 0.277991} completed successfully. 
{{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1271.378294] env[68906]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 6c28f571-e74a-48f9-9cc7-a9e4ddea8b09] Created VM on the ESX host {{(pid=68906) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1271.385495] env[68906]: DEBUG oslo_concurrency.lockutils [None req-8cc1b37d-3cc0-4e00-8b4e-1ea1e183df21 tempest-ServerAddressesTestJSON-218061599 tempest-ServerAddressesTestJSON-218061599-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1271.385722] env[68906]: DEBUG oslo_concurrency.lockutils [None req-8cc1b37d-3cc0-4e00-8b4e-1ea1e183df21 tempest-ServerAddressesTestJSON-218061599 tempest-ServerAddressesTestJSON-218061599-project-member] Acquired lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1271.385969] env[68906]: DEBUG oslo_concurrency.lockutils [None req-8cc1b37d-3cc0-4e00-8b4e-1ea1e183df21 tempest-ServerAddressesTestJSON-218061599 tempest-ServerAddressesTestJSON-218061599-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1271.386221] env[68906]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-9e69c378-061c-4e26-85e5-c26dfca450f9 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1271.390767] env[68906]: DEBUG oslo_vmware.api [None req-8cc1b37d-3cc0-4e00-8b4e-1ea1e183df21 tempest-ServerAddressesTestJSON-218061599 tempest-ServerAddressesTestJSON-218061599-project-member] Waiting for the task: (returnval){ [ 1271.390767] env[68906]: value = "session[52a3cb0d-1212-490c-2669-91043b4da4d8]523b157e-a41b-48ed-9d43-fc19c5c72be1" [ 1271.390767] env[68906]: _type = "Task" [ 1271.390767] env[68906]: } to complete. {{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1271.398999] env[68906]: DEBUG oslo_vmware.api [None req-8cc1b37d-3cc0-4e00-8b4e-1ea1e183df21 tempest-ServerAddressesTestJSON-218061599 tempest-ServerAddressesTestJSON-218061599-project-member] Task: {'id': session[52a3cb0d-1212-490c-2669-91043b4da4d8]523b157e-a41b-48ed-9d43-fc19c5c72be1, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1271.901236] env[68906]: DEBUG oslo_concurrency.lockutils [None req-8cc1b37d-3cc0-4e00-8b4e-1ea1e183df21 tempest-ServerAddressesTestJSON-218061599 tempest-ServerAddressesTestJSON-218061599-project-member] Releasing lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1271.901542] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-8cc1b37d-3cc0-4e00-8b4e-1ea1e183df21 tempest-ServerAddressesTestJSON-218061599 tempest-ServerAddressesTestJSON-218061599-project-member] [instance: 6c28f571-e74a-48f9-9cc7-a9e4ddea8b09] Processing image b1400c31-d33b-4e13-944f-4c645e62493e {{(pid=68906) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1271.901698] env[68906]: DEBUG oslo_concurrency.lockutils [None req-8cc1b37d-3cc0-4e00-8b4e-1ea1e183df21 tempest-ServerAddressesTestJSON-218061599 tempest-ServerAddressesTestJSON-218061599-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e/b1400c31-d33b-4e13-944f-4c645e62493e.vmdk" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1272.447558] env[68906]: DEBUG nova.compute.manager [req-a835ae81-7f45-45d3-a35c-ca7487dc624b req-0745f22d-90c4-4ebd-8296-73d429500c62 service nova] [instance: 6c28f571-e74a-48f9-9cc7-a9e4ddea8b09] Received event network-changed-783bfa88-946f-4031-9311-1512b3c2be53 {{(pid=68906) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1272.447697] env[68906]: DEBUG nova.compute.manager [req-a835ae81-7f45-45d3-a35c-ca7487dc624b req-0745f22d-90c4-4ebd-8296-73d429500c62 service nova] [instance: 6c28f571-e74a-48f9-9cc7-a9e4ddea8b09] Refreshing instance network info cache due to event network-changed-783bfa88-946f-4031-9311-1512b3c2be53. {{(pid=68906) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1272.447972] env[68906]: DEBUG oslo_concurrency.lockutils [req-a835ae81-7f45-45d3-a35c-ca7487dc624b req-0745f22d-90c4-4ebd-8296-73d429500c62 service nova] Acquiring lock "refresh_cache-6c28f571-e74a-48f9-9cc7-a9e4ddea8b09" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1272.448191] env[68906]: DEBUG oslo_concurrency.lockutils [req-a835ae81-7f45-45d3-a35c-ca7487dc624b req-0745f22d-90c4-4ebd-8296-73d429500c62 service nova] Acquired lock "refresh_cache-6c28f571-e74a-48f9-9cc7-a9e4ddea8b09" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1272.448271] env[68906]: DEBUG nova.network.neutron [req-a835ae81-7f45-45d3-a35c-ca7487dc624b req-0745f22d-90c4-4ebd-8296-73d429500c62 service nova] [instance: 6c28f571-e74a-48f9-9cc7-a9e4ddea8b09] Refreshing network info cache for port 783bfa88-946f-4031-9311-1512b3c2be53 {{(pid=68906) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1272.699628] env[68906]: DEBUG nova.network.neutron [req-a835ae81-7f45-45d3-a35c-ca7487dc624b req-0745f22d-90c4-4ebd-8296-73d429500c62 service nova] [instance: 6c28f571-e74a-48f9-9cc7-a9e4ddea8b09] Updated VIF entry in instance network info cache for port 783bfa88-946f-4031-9311-1512b3c2be53. 
{{(pid=68906) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1272.699990] env[68906]: DEBUG nova.network.neutron [req-a835ae81-7f45-45d3-a35c-ca7487dc624b req-0745f22d-90c4-4ebd-8296-73d429500c62 service nova] [instance: 6c28f571-e74a-48f9-9cc7-a9e4ddea8b09] Updating instance_info_cache with network_info: [{"id": "783bfa88-946f-4031-9311-1512b3c2be53", "address": "fa:16:3e:19:98:29", "network": {"id": "a314631a-365d-43a7-85c2-97777cb4546b", "bridge": "br-int", "label": "tempest-ServerAddressesTestJSON-1570205564-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "f0d5a1687ee84efea1690fc284b60dd4", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "41f66e20-fd86-4158-bbdc-7a150e85e844", "external-id": "nsx-vlan-transportzone-182", "segmentation_id": 182, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap783bfa88-94", "ovs_interfaceid": "783bfa88-946f-4031-9311-1512b3c2be53", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68906) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1272.709296] env[68906]: DEBUG oslo_concurrency.lockutils [req-a835ae81-7f45-45d3-a35c-ca7487dc624b req-0745f22d-90c4-4ebd-8296-73d429500c62 service nova] Releasing lock "refresh_cache-6c28f571-e74a-48f9-9cc7-a9e4ddea8b09" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1277.012339] env[68906]: DEBUG oslo_concurrency.lockutils [None req-df13a5bb-f088-42b3-ad46-a3d1262b27bf tempest-ServerAddressesTestJSON-218061599 tempest-ServerAddressesTestJSON-218061599-project-member] Acquiring lock "6c28f571-e74a-48f9-9cc7-a9e4ddea8b09" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1282.842186] env[68906]: DEBUG oslo_concurrency.lockutils [None req-6f158b16-c95b-47ba-98ef-cd5e461b3344 tempest-ServerShowV254Test-1577433897 tempest-ServerShowV254Test-1577433897-project-member] Acquiring lock "e0e595e3-e47e-4cf1-8977-f004eca942d1" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1287.142327] env[68906]: DEBUG oslo_concurrency.lockutils [None req-1ff0dcfe-1419-4b8a-a4ce-0afaadd97797 tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] Acquiring lock "aed06616-d008-4695-b66e-9f40acf5ebd3" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1287.142650] env[68906]: DEBUG oslo_concurrency.lockutils [None req-1ff0dcfe-1419-4b8a-a4ce-0afaadd97797 tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] Lock "aed06616-d008-4695-b66e-9f40acf5ebd3" acquired by 
"nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1291.822200] env[68906]: DEBUG oslo_concurrency.lockutils [None req-18f49ac7-7eb5-48de-bf53-912002f6914c tempest-DeleteServersTestJSON-1763795391 tempest-DeleteServersTestJSON-1763795391-project-member] Acquiring lock "c3386804-6ed9-46fe-b26d-3b5aae52c84b" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1291.822505] env[68906]: DEBUG oslo_concurrency.lockutils [None req-18f49ac7-7eb5-48de-bf53-912002f6914c tempest-DeleteServersTestJSON-1763795391 tempest-DeleteServersTestJSON-1763795391-project-member] Lock "c3386804-6ed9-46fe-b26d-3b5aae52c84b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1297.069707] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1298.140165] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1301.139998] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1303.140895] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1303.141298] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Starting heal instance info cache {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 1303.141298] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Rebuilding the list of instances to heal {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 1303.163013] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: 91d405f9-5ad1-4e8e-9a32-1a5ef81ecc63] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1303.163180] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: acc11633-a489-4d8f-ad76-f17049a91545] Skipping network cache update for instance because it is Building. 
{{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1303.163297] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: e7286888-d79d-4632-9c06-69c1ef47fa50] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1303.163420] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: 641cca5b-d749-4331-a5e0-8acb6d47cba2] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1303.163539] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: 9bd17d9a-2f34-4e99-91b1-a7c3fbc1f29a] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1303.163659] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: 4d36bb91-0cde-44cb-8706-d17740a9cf50] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1303.163777] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: db011373-7285-4882-8bce-d39cfa22fe80] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1303.163894] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: 1fdb401a-ac25-4418-803c-fc0b2297f2d4] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1303.164017] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: 6c28f571-e74a-48f9-9cc7-a9e4ddea8b09] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1303.164134] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: e0e595e3-e47e-4cf1-8977-f004eca942d1] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1303.164252] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Didn't find any instances for network info cache update. 
{{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 1303.164731] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1303.164914] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1303.165061] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=68906) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 1305.140974] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1305.141362] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1307.140915] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1308.141915] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager.update_available_resource {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1308.156284] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1308.156648] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1308.156928] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1308.157851] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=68906) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1308.158926] env[68906]: DEBUG oslo_vmware.service [-] 
Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6ec5f743-440e-4924-82d6-ccfc804137b9 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1308.171322] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-680fc021-8b0e-464e-85c1-9583fc7155fa {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1308.189596] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0fb6d4eb-02f2-4e97-b4ef-d853e6efd9aa {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1308.198371] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-104f3d9b-6953-408b-9144-7717850c6712 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1308.228548] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180921MB free_disk=93GB free_vcpus=48 pci_devices=None {{(pid=68906) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1308.228718] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1308.229300] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1308.309080] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 91d405f9-5ad1-4e8e-9a32-1a5ef81ecc63 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1308.309262] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance acc11633-a489-4d8f-ad76-f17049a91545 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1308.309392] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance e7286888-d79d-4632-9c06-69c1ef47fa50 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1308.309517] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 641cca5b-d749-4331-a5e0-8acb6d47cba2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1308.309637] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 9bd17d9a-2f34-4e99-91b1-a7c3fbc1f29a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1308.309757] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 4d36bb91-0cde-44cb-8706-d17740a9cf50 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1308.309962] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance db011373-7285-4882-8bce-d39cfa22fe80 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1308.310085] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 1fdb401a-ac25-4418-803c-fc0b2297f2d4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1308.310151] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 6c28f571-e74a-48f9-9cc7-a9e4ddea8b09 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1308.310252] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance e0e595e3-e47e-4cf1-8977-f004eca942d1 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1308.322287] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 7466df8a-59a9-49b9-bff7-c4efbeae3eee has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1308.333390] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 89171680-c76d-4826-9236-379542661ffb has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1308.344256] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 9b884416-df89-4d8c-b2ab-0667db52a718 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1308.354616] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 917ba3c3-9188-40fa-be6c-cdab27b76970 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1308.364804] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 7803f951-a0c0-4246-b2d9-3eabadfa679d has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1308.375796] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 8a4e18b6-55c0-4397-b570-27db4541e9b3 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1308.387132] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 2b688987-d4cf-4ebb-83c4-d5fa7f5bcbb9 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1308.397482] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 3ce59687-c677-40bd-8af4-c2f4b576e86e has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1308.408294] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 45c0d7ba-6d21-46d1-8bcb-0318bd93f885 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1308.419088] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 75a4f8bc-09aa-4c9b-b705-fb84ddcf60ea has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1308.430607] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance ff99f1e3-9a4a-487e-afcb-6d8439a0491d has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1308.441308] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance aed06616-d008-4695-b66e-9f40acf5ebd3 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1308.452179] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance c3386804-6ed9-46fe-b26d-3b5aae52c84b has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1308.452421] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=68906) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1308.452567] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=68906) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1308.757547] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ab53c22e-410c-4364-8f15-e28fba11be05 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1308.765819] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-868072c8-c6a8-451b-948b-11acc94c1f30 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1308.806621] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6edff2a8-05ac-43d6-8742-37d4d56c5007 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1308.817473] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-32e777e4-f402-45ce-bd9b-438da58e26b3 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1308.838486] env[68906]: DEBUG nova.compute.provider_tree [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Inventory has not changed in ProviderTree for provider: 1119f6db-bfd7-4ef3-bdff-5c6974dc249b {{(pid=68906) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1308.847806] env[68906]: DEBUG nova.scheduler.client.report [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Inventory has not changed for provider 1119f6db-bfd7-4ef3-bdff-5c6974dc249b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 93, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68906) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1308.863460] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=68906) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1308.863778] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.635s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1315.857021] env[68906]: WARNING oslo_vmware.rw_handles [None 
req-236f039b-07c6-4a4d-a7d0-3f9cc96aa05c tempest-AttachInterfacesV270Test-1110429519 tempest-AttachInterfacesV270Test-1110429519-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1315.857021] env[68906]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1315.857021] env[68906]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1315.857021] env[68906]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1315.857021] env[68906]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1315.857021] env[68906]: ERROR oslo_vmware.rw_handles response.begin() [ 1315.857021] env[68906]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1315.857021] env[68906]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1315.857021] env[68906]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1315.857021] env[68906]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1315.857021] env[68906]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1315.857021] env[68906]: ERROR oslo_vmware.rw_handles [ 1315.857021] env[68906]: DEBUG nova.virt.vmwareapi.images [None req-236f039b-07c6-4a4d-a7d0-3f9cc96aa05c tempest-AttachInterfacesV270Test-1110429519 tempest-AttachInterfacesV270Test-1110429519-project-member] [instance: 91d405f9-5ad1-4e8e-9a32-1a5ef81ecc63] Downloaded image file data b1400c31-d33b-4e13-944f-4c645e62493e to vmware_temp/2b63fbd5-de43-447a-b5c2-d0ff342edee0/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk on the data store datastore2 {{(pid=68906) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1315.857731] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-236f039b-07c6-4a4d-a7d0-3f9cc96aa05c tempest-AttachInterfacesV270Test-1110429519 tempest-AttachInterfacesV270Test-1110429519-project-member] [instance: 91d405f9-5ad1-4e8e-9a32-1a5ef81ecc63] Caching image {{(pid=68906) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1315.857731] env[68906]: DEBUG nova.virt.vmwareapi.vm_util [None req-236f039b-07c6-4a4d-a7d0-3f9cc96aa05c tempest-AttachInterfacesV270Test-1110429519 tempest-AttachInterfacesV270Test-1110429519-project-member] Copying Virtual Disk [datastore2] vmware_temp/2b63fbd5-de43-447a-b5c2-d0ff342edee0/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk to [datastore2] vmware_temp/2b63fbd5-de43-447a-b5c2-d0ff342edee0/b1400c31-d33b-4e13-944f-4c645e62493e/b1400c31-d33b-4e13-944f-4c645e62493e.vmdk {{(pid=68906) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1315.857731] env[68906]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-e27162cd-a33b-4db2-b3d5-0a4b82cf38ab {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1315.868521] env[68906]: DEBUG oslo_vmware.api [None req-236f039b-07c6-4a4d-a7d0-3f9cc96aa05c tempest-AttachInterfacesV270Test-1110429519 tempest-AttachInterfacesV270Test-1110429519-project-member] Waiting for the task: (returnval){ [ 1315.868521] env[68906]: value = "task-3475380" [ 1315.868521] env[68906]: _type = "Task" [ 1315.868521] 
env[68906]: } to complete. {{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1315.876634] env[68906]: DEBUG oslo_vmware.api [None req-236f039b-07c6-4a4d-a7d0-3f9cc96aa05c tempest-AttachInterfacesV270Test-1110429519 tempest-AttachInterfacesV270Test-1110429519-project-member] Task: {'id': task-3475380, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1316.378506] env[68906]: DEBUG oslo_vmware.exceptions [None req-236f039b-07c6-4a4d-a7d0-3f9cc96aa05c tempest-AttachInterfacesV270Test-1110429519 tempest-AttachInterfacesV270Test-1110429519-project-member] Fault InvalidArgument not matched. {{(pid=68906) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1316.379564] env[68906]: DEBUG oslo_concurrency.lockutils [None req-236f039b-07c6-4a4d-a7d0-3f9cc96aa05c tempest-AttachInterfacesV270Test-1110429519 tempest-AttachInterfacesV270Test-1110429519-project-member] Releasing lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e/b1400c31-d33b-4e13-944f-4c645e62493e.vmdk" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1316.379647] env[68906]: ERROR nova.compute.manager [None req-236f039b-07c6-4a4d-a7d0-3f9cc96aa05c tempest-AttachInterfacesV270Test-1110429519 tempest-AttachInterfacesV270Test-1110429519-project-member] [instance: 91d405f9-5ad1-4e8e-9a32-1a5ef81ecc63] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1316.379647] env[68906]: Faults: ['InvalidArgument'] [ 1316.379647] env[68906]: ERROR nova.compute.manager [instance: 91d405f9-5ad1-4e8e-9a32-1a5ef81ecc63] Traceback (most recent call last): [ 1316.379647] env[68906]: ERROR nova.compute.manager [instance: 91d405f9-5ad1-4e8e-9a32-1a5ef81ecc63] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1316.379647] env[68906]: ERROR nova.compute.manager [instance: 91d405f9-5ad1-4e8e-9a32-1a5ef81ecc63] yield resources [ 1316.379647] env[68906]: ERROR nova.compute.manager [instance: 91d405f9-5ad1-4e8e-9a32-1a5ef81ecc63] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1316.379647] env[68906]: ERROR nova.compute.manager [instance: 91d405f9-5ad1-4e8e-9a32-1a5ef81ecc63] self.driver.spawn(context, instance, image_meta, [ 1316.379647] env[68906]: ERROR nova.compute.manager [instance: 91d405f9-5ad1-4e8e-9a32-1a5ef81ecc63] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1316.379647] env[68906]: ERROR nova.compute.manager [instance: 91d405f9-5ad1-4e8e-9a32-1a5ef81ecc63] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1316.379647] env[68906]: ERROR nova.compute.manager [instance: 91d405f9-5ad1-4e8e-9a32-1a5ef81ecc63] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1316.379647] env[68906]: ERROR nova.compute.manager [instance: 91d405f9-5ad1-4e8e-9a32-1a5ef81ecc63] self._fetch_image_if_missing(context, vi) [ 1316.379647] env[68906]: ERROR nova.compute.manager [instance: 91d405f9-5ad1-4e8e-9a32-1a5ef81ecc63] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1316.380097] env[68906]: ERROR nova.compute.manager [instance: 91d405f9-5ad1-4e8e-9a32-1a5ef81ecc63] image_cache(vi, tmp_image_ds_loc) [ 1316.380097] env[68906]: ERROR 
nova.compute.manager [instance: 91d405f9-5ad1-4e8e-9a32-1a5ef81ecc63] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1316.380097] env[68906]: ERROR nova.compute.manager [instance: 91d405f9-5ad1-4e8e-9a32-1a5ef81ecc63] vm_util.copy_virtual_disk( [ 1316.380097] env[68906]: ERROR nova.compute.manager [instance: 91d405f9-5ad1-4e8e-9a32-1a5ef81ecc63] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1316.380097] env[68906]: ERROR nova.compute.manager [instance: 91d405f9-5ad1-4e8e-9a32-1a5ef81ecc63] session._wait_for_task(vmdk_copy_task) [ 1316.380097] env[68906]: ERROR nova.compute.manager [instance: 91d405f9-5ad1-4e8e-9a32-1a5ef81ecc63] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1316.380097] env[68906]: ERROR nova.compute.manager [instance: 91d405f9-5ad1-4e8e-9a32-1a5ef81ecc63] return self.wait_for_task(task_ref) [ 1316.380097] env[68906]: ERROR nova.compute.manager [instance: 91d405f9-5ad1-4e8e-9a32-1a5ef81ecc63] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1316.380097] env[68906]: ERROR nova.compute.manager [instance: 91d405f9-5ad1-4e8e-9a32-1a5ef81ecc63] return evt.wait() [ 1316.380097] env[68906]: ERROR nova.compute.manager [instance: 91d405f9-5ad1-4e8e-9a32-1a5ef81ecc63] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1316.380097] env[68906]: ERROR nova.compute.manager [instance: 91d405f9-5ad1-4e8e-9a32-1a5ef81ecc63] result = hub.switch() [ 1316.380097] env[68906]: ERROR nova.compute.manager [instance: 91d405f9-5ad1-4e8e-9a32-1a5ef81ecc63] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1316.380097] env[68906]: ERROR nova.compute.manager [instance: 91d405f9-5ad1-4e8e-9a32-1a5ef81ecc63] return self.greenlet.switch() [ 1316.380514] env[68906]: ERROR nova.compute.manager [instance: 91d405f9-5ad1-4e8e-9a32-1a5ef81ecc63] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1316.380514] env[68906]: ERROR nova.compute.manager [instance: 91d405f9-5ad1-4e8e-9a32-1a5ef81ecc63] self.f(*self.args, **self.kw) [ 1316.380514] env[68906]: ERROR nova.compute.manager [instance: 91d405f9-5ad1-4e8e-9a32-1a5ef81ecc63] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1316.380514] env[68906]: ERROR nova.compute.manager [instance: 91d405f9-5ad1-4e8e-9a32-1a5ef81ecc63] raise exceptions.translate_fault(task_info.error) [ 1316.380514] env[68906]: ERROR nova.compute.manager [instance: 91d405f9-5ad1-4e8e-9a32-1a5ef81ecc63] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1316.380514] env[68906]: ERROR nova.compute.manager [instance: 91d405f9-5ad1-4e8e-9a32-1a5ef81ecc63] Faults: ['InvalidArgument'] [ 1316.380514] env[68906]: ERROR nova.compute.manager [instance: 91d405f9-5ad1-4e8e-9a32-1a5ef81ecc63] [ 1316.380514] env[68906]: INFO nova.compute.manager [None req-236f039b-07c6-4a4d-a7d0-3f9cc96aa05c tempest-AttachInterfacesV270Test-1110429519 tempest-AttachInterfacesV270Test-1110429519-project-member] [instance: 91d405f9-5ad1-4e8e-9a32-1a5ef81ecc63] Terminating instance [ 1316.381823] env[68906]: DEBUG oslo_concurrency.lockutils [None req-86e3143e-5477-482e-8022-d48225e10873 tempest-ListImageFiltersTestJSON-730025420 tempest-ListImageFiltersTestJSON-730025420-project-member] Acquired 
lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e/b1400c31-d33b-4e13-944f-4c645e62493e.vmdk" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1316.382036] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-86e3143e-5477-482e-8022-d48225e10873 tempest-ListImageFiltersTestJSON-730025420 tempest-ListImageFiltersTestJSON-730025420-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=68906) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1316.382272] env[68906]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-4ac00bde-b102-41f2-b94e-1000c7fdd7f5 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1316.384268] env[68906]: DEBUG nova.compute.manager [None req-236f039b-07c6-4a4d-a7d0-3f9cc96aa05c tempest-AttachInterfacesV270Test-1110429519 tempest-AttachInterfacesV270Test-1110429519-project-member] [instance: 91d405f9-5ad1-4e8e-9a32-1a5ef81ecc63] Start destroying the instance on the hypervisor. {{(pid=68906) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1316.384460] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-236f039b-07c6-4a4d-a7d0-3f9cc96aa05c tempest-AttachInterfacesV270Test-1110429519 tempest-AttachInterfacesV270Test-1110429519-project-member] [instance: 91d405f9-5ad1-4e8e-9a32-1a5ef81ecc63] Destroying instance {{(pid=68906) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1316.385159] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c564c27a-af6a-4611-b489-a6bff93468ef {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1316.391675] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-236f039b-07c6-4a4d-a7d0-3f9cc96aa05c tempest-AttachInterfacesV270Test-1110429519 tempest-AttachInterfacesV270Test-1110429519-project-member] [instance: 91d405f9-5ad1-4e8e-9a32-1a5ef81ecc63] Unregistering the VM {{(pid=68906) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1316.391872] env[68906]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-d7fd4014-56f1-42cd-b970-109b9b8683d4 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1316.393902] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-86e3143e-5477-482e-8022-d48225e10873 tempest-ListImageFiltersTestJSON-730025420 tempest-ListImageFiltersTestJSON-730025420-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=68906) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1316.394076] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-86e3143e-5477-482e-8022-d48225e10873 tempest-ListImageFiltersTestJSON-730025420 tempest-ListImageFiltersTestJSON-730025420-project-member] Folder [datastore2] devstack-image-cache_base created. 
{{(pid=68906) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1316.394973] env[68906]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-6b293bdf-04f4-4bf8-98a5-eb937958940e {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1316.399704] env[68906]: DEBUG oslo_vmware.api [None req-86e3143e-5477-482e-8022-d48225e10873 tempest-ListImageFiltersTestJSON-730025420 tempest-ListImageFiltersTestJSON-730025420-project-member] Waiting for the task: (returnval){ [ 1316.399704] env[68906]: value = "session[52a3cb0d-1212-490c-2669-91043b4da4d8]5276b99f-fb96-def3-78e4-8a281903964e" [ 1316.399704] env[68906]: _type = "Task" [ 1316.399704] env[68906]: } to complete. {{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1316.406635] env[68906]: DEBUG oslo_vmware.api [None req-86e3143e-5477-482e-8022-d48225e10873 tempest-ListImageFiltersTestJSON-730025420 tempest-ListImageFiltersTestJSON-730025420-project-member] Task: {'id': session[52a3cb0d-1212-490c-2669-91043b4da4d8]5276b99f-fb96-def3-78e4-8a281903964e, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1316.457733] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-236f039b-07c6-4a4d-a7d0-3f9cc96aa05c tempest-AttachInterfacesV270Test-1110429519 tempest-AttachInterfacesV270Test-1110429519-project-member] [instance: 91d405f9-5ad1-4e8e-9a32-1a5ef81ecc63] Unregistered the VM {{(pid=68906) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1316.457938] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-236f039b-07c6-4a4d-a7d0-3f9cc96aa05c tempest-AttachInterfacesV270Test-1110429519 tempest-AttachInterfacesV270Test-1110429519-project-member] [instance: 91d405f9-5ad1-4e8e-9a32-1a5ef81ecc63] Deleting contents of the VM from datastore datastore2 {{(pid=68906) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1316.458135] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-236f039b-07c6-4a4d-a7d0-3f9cc96aa05c tempest-AttachInterfacesV270Test-1110429519 tempest-AttachInterfacesV270Test-1110429519-project-member] Deleting the datastore file [datastore2] 91d405f9-5ad1-4e8e-9a32-1a5ef81ecc63 {{(pid=68906) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1316.458389] env[68906]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-4ab8b7c4-5833-4941-b9e5-21d93e6e1dba {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1316.465026] env[68906]: DEBUG oslo_vmware.api [None req-236f039b-07c6-4a4d-a7d0-3f9cc96aa05c tempest-AttachInterfacesV270Test-1110429519 tempest-AttachInterfacesV270Test-1110429519-project-member] Waiting for the task: (returnval){ [ 1316.465026] env[68906]: value = "task-3475382" [ 1316.465026] env[68906]: _type = "Task" [ 1316.465026] env[68906]: } to complete. {{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1316.474012] env[68906]: DEBUG oslo_vmware.api [None req-236f039b-07c6-4a4d-a7d0-3f9cc96aa05c tempest-AttachInterfacesV270Test-1110429519 tempest-AttachInterfacesV270Test-1110429519-project-member] Task: {'id': task-3475382, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1316.909906] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-86e3143e-5477-482e-8022-d48225e10873 tempest-ListImageFiltersTestJSON-730025420 tempest-ListImageFiltersTestJSON-730025420-project-member] [instance: acc11633-a489-4d8f-ad76-f17049a91545] Preparing fetch location {{(pid=68906) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1316.910203] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-86e3143e-5477-482e-8022-d48225e10873 tempest-ListImageFiltersTestJSON-730025420 tempest-ListImageFiltersTestJSON-730025420-project-member] Creating directory with path [datastore2] vmware_temp/dd01627f-54f8-4add-8675-5497474dcf23/b1400c31-d33b-4e13-944f-4c645e62493e {{(pid=68906) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1316.910410] env[68906]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-98f4d279-2814-47cf-abe1-4411d330d878 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1316.938790] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-86e3143e-5477-482e-8022-d48225e10873 tempest-ListImageFiltersTestJSON-730025420 tempest-ListImageFiltersTestJSON-730025420-project-member] Created directory with path [datastore2] vmware_temp/dd01627f-54f8-4add-8675-5497474dcf23/b1400c31-d33b-4e13-944f-4c645e62493e {{(pid=68906) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1316.938790] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-86e3143e-5477-482e-8022-d48225e10873 tempest-ListImageFiltersTestJSON-730025420 tempest-ListImageFiltersTestJSON-730025420-project-member] [instance: acc11633-a489-4d8f-ad76-f17049a91545] Fetch image to [datastore2] vmware_temp/dd01627f-54f8-4add-8675-5497474dcf23/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk {{(pid=68906) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1316.938790] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-86e3143e-5477-482e-8022-d48225e10873 tempest-ListImageFiltersTestJSON-730025420 tempest-ListImageFiltersTestJSON-730025420-project-member] [instance: acc11633-a489-4d8f-ad76-f17049a91545] Downloading image file data b1400c31-d33b-4e13-944f-4c645e62493e to [datastore2] vmware_temp/dd01627f-54f8-4add-8675-5497474dcf23/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk on the data store datastore2 {{(pid=68906) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1316.938790] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8ac6de8d-cf13-42a0-a12c-77fe5a5a8980 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1316.939368] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-144ec397-b135-42b3-aa68-0750678432fa {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1316.942624] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bf3c0769-7250-4659-8a30-0c2b0026f63f {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1316.977403] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-73350f30-f19a-427d-81a3-bf57c334c2a0 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1316.985791] env[68906]: DEBUG oslo_vmware.api [None req-236f039b-07c6-4a4d-a7d0-3f9cc96aa05c tempest-AttachInterfacesV270Test-1110429519 tempest-AttachInterfacesV270Test-1110429519-project-member] Task: {'id': task-3475382, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.076438} completed successfully. {{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1316.987459] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-236f039b-07c6-4a4d-a7d0-3f9cc96aa05c tempest-AttachInterfacesV270Test-1110429519 tempest-AttachInterfacesV270Test-1110429519-project-member] Deleted the datastore file {{(pid=68906) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1316.987679] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-236f039b-07c6-4a4d-a7d0-3f9cc96aa05c tempest-AttachInterfacesV270Test-1110429519 tempest-AttachInterfacesV270Test-1110429519-project-member] [instance: 91d405f9-5ad1-4e8e-9a32-1a5ef81ecc63] Deleted contents of the VM from datastore datastore2 {{(pid=68906) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1316.987860] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-236f039b-07c6-4a4d-a7d0-3f9cc96aa05c tempest-AttachInterfacesV270Test-1110429519 tempest-AttachInterfacesV270Test-1110429519-project-member] [instance: 91d405f9-5ad1-4e8e-9a32-1a5ef81ecc63] Instance destroyed {{(pid=68906) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1316.988047] env[68906]: INFO nova.compute.manager [None req-236f039b-07c6-4a4d-a7d0-3f9cc96aa05c tempest-AttachInterfacesV270Test-1110429519 tempest-AttachInterfacesV270Test-1110429519-project-member] [instance: 91d405f9-5ad1-4e8e-9a32-1a5ef81ecc63] Took 0.60 seconds to destroy the instance on the hypervisor. 
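Every vSphere task in this trace (CopyVirtualDisk_Task, SearchDatastore_Task, DeleteDatastoreFile_Task) follows the same oslo.vmware lifecycle visible in the surrounding records: wait_for_task blocks on an eventlet event while a looping call re-reads the task state, logging "progress is N%" until it flips to success ("completed successfully" with duration_secs) or to error, at which point translate_fault(task_info.error) raises the VimFaultException that unwinds through evt.wait() and hub.switch() in the tracebacks above. A minimal, self-contained sketch of that polling contract follows -- get_task_info is a hypothetical stand-in for the real PropertyCollector read, not the library's implementation:

import time

class TaskFault(Exception):
    """Stand-in for oslo_vmware.exceptions.VimFaultException."""

def wait_for_task(get_task_info, poll_interval=0.5):
    # get_task_info is an assumed callable returning an object with
    # .state ('queued' | 'running' | 'success' | 'error'), .progress,
    # and .error -- analogous to the vSphere TaskInfo that oslo.vmware
    # re-reads on each _poll_task iteration.
    while True:
        info = get_task_info()
        if info.state in ('queued', 'running'):
            # corresponds to the "progress is 0%" DEBUG lines above
            print("Task progress is %s%%." % (info.progress or 0))
            time.sleep(poll_interval)
        elif info.state == 'success':
            # corresponds to "completed successfully ... duration_secs"
            return info
        else:
            # corresponds to raise exceptions.translate_fault(task_info.error),
            # e.g. "A specified parameter was not correct: fileType"
            raise TaskFault(info.error)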
[ 1316.989870] env[68906]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-38b7a77c-6b1e-4548-b24b-ad23a6f30290 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1316.991954] env[68906]: DEBUG nova.compute.claims [None req-236f039b-07c6-4a4d-a7d0-3f9cc96aa05c tempest-AttachInterfacesV270Test-1110429519 tempest-AttachInterfacesV270Test-1110429519-project-member] [instance: 91d405f9-5ad1-4e8e-9a32-1a5ef81ecc63] Aborting claim: {{(pid=68906) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1316.992153] env[68906]: DEBUG oslo_concurrency.lockutils [None req-236f039b-07c6-4a4d-a7d0-3f9cc96aa05c tempest-AttachInterfacesV270Test-1110429519 tempest-AttachInterfacesV270Test-1110429519-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1316.992748] env[68906]: DEBUG oslo_concurrency.lockutils [None req-236f039b-07c6-4a4d-a7d0-3f9cc96aa05c tempest-AttachInterfacesV270Test-1110429519 tempest-AttachInterfacesV270Test-1110429519-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1317.014075] env[68906]: DEBUG nova.virt.vmwareapi.images [None req-86e3143e-5477-482e-8022-d48225e10873 tempest-ListImageFiltersTestJSON-730025420 tempest-ListImageFiltersTestJSON-730025420-project-member] [instance: acc11633-a489-4d8f-ad76-f17049a91545] Downloading image file data b1400c31-d33b-4e13-944f-4c645e62493e to the data store datastore2 {{(pid=68906) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1317.083741] env[68906]: DEBUG oslo_vmware.rw_handles [None req-86e3143e-5477-482e-8022-d48225e10873 tempest-ListImageFiltersTestJSON-730025420 tempest-ListImageFiltersTestJSON-730025420-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/dd01627f-54f8-4add-8675-5497474dcf23/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=68906) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1317.146167] env[68906]: DEBUG oslo_vmware.rw_handles [None req-86e3143e-5477-482e-8022-d48225e10873 tempest-ListImageFiltersTestJSON-730025420 tempest-ListImageFiltersTestJSON-730025420-project-member] Completed reading data from the image iterator. {{(pid=68906) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1317.146359] env[68906]: DEBUG oslo_vmware.rw_handles [None req-86e3143e-5477-482e-8022-d48225e10873 tempest-ListImageFiltersTestJSON-730025420 tempest-ListImageFiltersTestJSON-730025420-project-member] Closing write handle for https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/dd01627f-54f8-4add-8675-5497474dcf23/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=68906) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1317.371434] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-be874a87-060c-4477-85df-d00c05735fdc {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1317.379078] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-444cc823-66e0-45d8-9662-495ba4352a84 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1317.407853] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ec18d3cc-0f7d-4152-bafc-25609c5c7b2e {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1317.414487] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-12aeeb31-c96b-46f2-afa7-4efeeedc81f7 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1317.427613] env[68906]: DEBUG nova.compute.provider_tree [None req-236f039b-07c6-4a4d-a7d0-3f9cc96aa05c tempest-AttachInterfacesV270Test-1110429519 tempest-AttachInterfacesV270Test-1110429519-project-member] Inventory has not changed in ProviderTree for provider: 1119f6db-bfd7-4ef3-bdff-5c6974dc249b {{(pid=68906) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1317.436523] env[68906]: DEBUG nova.scheduler.client.report [None req-236f039b-07c6-4a4d-a7d0-3f9cc96aa05c tempest-AttachInterfacesV270Test-1110429519 tempest-AttachInterfacesV270Test-1110429519-project-member] Inventory has not changed for provider 1119f6db-bfd7-4ef3-bdff-5c6974dc249b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 93, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68906) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1317.449252] env[68906]: DEBUG oslo_concurrency.lockutils [None req-236f039b-07c6-4a4d-a7d0-3f9cc96aa05c tempest-AttachInterfacesV270Test-1110429519 tempest-AttachInterfacesV270Test-1110429519-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.457s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1317.449729] env[68906]: ERROR nova.compute.manager [None req-236f039b-07c6-4a4d-a7d0-3f9cc96aa05c tempest-AttachInterfacesV270Test-1110429519 tempest-AttachInterfacesV270Test-1110429519-project-member] [instance: 91d405f9-5ad1-4e8e-9a32-1a5ef81ecc63] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1317.449729] env[68906]: Faults: ['InvalidArgument'] [ 1317.449729] env[68906]: ERROR nova.compute.manager [instance: 91d405f9-5ad1-4e8e-9a32-1a5ef81ecc63] Traceback (most recent call last): [ 1317.449729] env[68906]: ERROR nova.compute.manager [instance: 91d405f9-5ad1-4e8e-9a32-1a5ef81ecc63] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1317.449729] 
env[68906]: ERROR nova.compute.manager [instance: 91d405f9-5ad1-4e8e-9a32-1a5ef81ecc63] self.driver.spawn(context, instance, image_meta, [ 1317.449729] env[68906]: ERROR nova.compute.manager [instance: 91d405f9-5ad1-4e8e-9a32-1a5ef81ecc63] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1317.449729] env[68906]: ERROR nova.compute.manager [instance: 91d405f9-5ad1-4e8e-9a32-1a5ef81ecc63] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1317.449729] env[68906]: ERROR nova.compute.manager [instance: 91d405f9-5ad1-4e8e-9a32-1a5ef81ecc63] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1317.449729] env[68906]: ERROR nova.compute.manager [instance: 91d405f9-5ad1-4e8e-9a32-1a5ef81ecc63] self._fetch_image_if_missing(context, vi) [ 1317.449729] env[68906]: ERROR nova.compute.manager [instance: 91d405f9-5ad1-4e8e-9a32-1a5ef81ecc63] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1317.449729] env[68906]: ERROR nova.compute.manager [instance: 91d405f9-5ad1-4e8e-9a32-1a5ef81ecc63] image_cache(vi, tmp_image_ds_loc) [ 1317.449729] env[68906]: ERROR nova.compute.manager [instance: 91d405f9-5ad1-4e8e-9a32-1a5ef81ecc63] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1317.449989] env[68906]: ERROR nova.compute.manager [instance: 91d405f9-5ad1-4e8e-9a32-1a5ef81ecc63] vm_util.copy_virtual_disk( [ 1317.449989] env[68906]: ERROR nova.compute.manager [instance: 91d405f9-5ad1-4e8e-9a32-1a5ef81ecc63] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1317.449989] env[68906]: ERROR nova.compute.manager [instance: 91d405f9-5ad1-4e8e-9a32-1a5ef81ecc63] session._wait_for_task(vmdk_copy_task) [ 1317.449989] env[68906]: ERROR nova.compute.manager [instance: 91d405f9-5ad1-4e8e-9a32-1a5ef81ecc63] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1317.449989] env[68906]: ERROR nova.compute.manager [instance: 91d405f9-5ad1-4e8e-9a32-1a5ef81ecc63] return self.wait_for_task(task_ref) [ 1317.449989] env[68906]: ERROR nova.compute.manager [instance: 91d405f9-5ad1-4e8e-9a32-1a5ef81ecc63] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1317.449989] env[68906]: ERROR nova.compute.manager [instance: 91d405f9-5ad1-4e8e-9a32-1a5ef81ecc63] return evt.wait() [ 1317.449989] env[68906]: ERROR nova.compute.manager [instance: 91d405f9-5ad1-4e8e-9a32-1a5ef81ecc63] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1317.449989] env[68906]: ERROR nova.compute.manager [instance: 91d405f9-5ad1-4e8e-9a32-1a5ef81ecc63] result = hub.switch() [ 1317.449989] env[68906]: ERROR nova.compute.manager [instance: 91d405f9-5ad1-4e8e-9a32-1a5ef81ecc63] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1317.449989] env[68906]: ERROR nova.compute.manager [instance: 91d405f9-5ad1-4e8e-9a32-1a5ef81ecc63] return self.greenlet.switch() [ 1317.449989] env[68906]: ERROR nova.compute.manager [instance: 91d405f9-5ad1-4e8e-9a32-1a5ef81ecc63] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1317.449989] env[68906]: ERROR nova.compute.manager [instance: 91d405f9-5ad1-4e8e-9a32-1a5ef81ecc63] self.f(*self.args, **self.kw) [ 1317.450506] env[68906]: ERROR nova.compute.manager [instance: 91d405f9-5ad1-4e8e-9a32-1a5ef81ecc63] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1317.450506] env[68906]: ERROR nova.compute.manager [instance: 91d405f9-5ad1-4e8e-9a32-1a5ef81ecc63] raise exceptions.translate_fault(task_info.error) [ 1317.450506] env[68906]: ERROR nova.compute.manager [instance: 91d405f9-5ad1-4e8e-9a32-1a5ef81ecc63] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1317.450506] env[68906]: ERROR nova.compute.manager [instance: 91d405f9-5ad1-4e8e-9a32-1a5ef81ecc63] Faults: ['InvalidArgument'] [ 1317.450506] env[68906]: ERROR nova.compute.manager [instance: 91d405f9-5ad1-4e8e-9a32-1a5ef81ecc63] [ 1317.450506] env[68906]: DEBUG nova.compute.utils [None req-236f039b-07c6-4a4d-a7d0-3f9cc96aa05c tempest-AttachInterfacesV270Test-1110429519 tempest-AttachInterfacesV270Test-1110429519-project-member] [instance: 91d405f9-5ad1-4e8e-9a32-1a5ef81ecc63] VimFaultException {{(pid=68906) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1317.451768] env[68906]: DEBUG nova.compute.manager [None req-236f039b-07c6-4a4d-a7d0-3f9cc96aa05c tempest-AttachInterfacesV270Test-1110429519 tempest-AttachInterfacesV270Test-1110429519-project-member] [instance: 91d405f9-5ad1-4e8e-9a32-1a5ef81ecc63] Build of instance 91d405f9-5ad1-4e8e-9a32-1a5ef81ecc63 was re-scheduled: A specified parameter was not correct: fileType [ 1317.451768] env[68906]: Faults: ['InvalidArgument'] {{(pid=68906) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1317.452155] env[68906]: DEBUG nova.compute.manager [None req-236f039b-07c6-4a4d-a7d0-3f9cc96aa05c tempest-AttachInterfacesV270Test-1110429519 tempest-AttachInterfacesV270Test-1110429519-project-member] [instance: 91d405f9-5ad1-4e8e-9a32-1a5ef81ecc63] Unplugging VIFs for instance {{(pid=68906) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1317.452331] env[68906]: DEBUG nova.compute.manager [None req-236f039b-07c6-4a4d-a7d0-3f9cc96aa05c tempest-AttachInterfacesV270Test-1110429519 tempest-AttachInterfacesV270Test-1110429519-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=68906) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1317.452499] env[68906]: DEBUG nova.compute.manager [None req-236f039b-07c6-4a4d-a7d0-3f9cc96aa05c tempest-AttachInterfacesV270Test-1110429519 tempest-AttachInterfacesV270Test-1110429519-project-member] [instance: 91d405f9-5ad1-4e8e-9a32-1a5ef81ecc63] Deallocating network for instance {{(pid=68906) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1317.452660] env[68906]: DEBUG nova.network.neutron [None req-236f039b-07c6-4a4d-a7d0-3f9cc96aa05c tempest-AttachInterfacesV270Test-1110429519 tempest-AttachInterfacesV270Test-1110429519-project-member] [instance: 91d405f9-5ad1-4e8e-9a32-1a5ef81ecc63] deallocate_for_instance() {{(pid=68906) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1318.079105] env[68906]: DEBUG nova.network.neutron [None req-236f039b-07c6-4a4d-a7d0-3f9cc96aa05c tempest-AttachInterfacesV270Test-1110429519 tempest-AttachInterfacesV270Test-1110429519-project-member] [instance: 91d405f9-5ad1-4e8e-9a32-1a5ef81ecc63] Updating instance_info_cache with network_info: [] {{(pid=68906) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1318.090560] env[68906]: INFO nova.compute.manager [None req-236f039b-07c6-4a4d-a7d0-3f9cc96aa05c tempest-AttachInterfacesV270Test-1110429519 tempest-AttachInterfacesV270Test-1110429519-project-member] [instance: 91d405f9-5ad1-4e8e-9a32-1a5ef81ecc63] Took 0.64 seconds to deallocate network for instance. [ 1318.189720] env[68906]: INFO nova.scheduler.client.report [None req-236f039b-07c6-4a4d-a7d0-3f9cc96aa05c tempest-AttachInterfacesV270Test-1110429519 tempest-AttachInterfacesV270Test-1110429519-project-member] Deleted allocations for instance 91d405f9-5ad1-4e8e-9a32-1a5ef81ecc63 [ 1318.212239] env[68906]: DEBUG oslo_concurrency.lockutils [None req-236f039b-07c6-4a4d-a7d0-3f9cc96aa05c tempest-AttachInterfacesV270Test-1110429519 tempest-AttachInterfacesV270Test-1110429519-project-member] Lock "91d405f9-5ad1-4e8e-9a32-1a5ef81ecc63" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 632.075s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1318.213736] env[68906]: DEBUG oslo_concurrency.lockutils [None req-97a35496-b318-4c52-a18b-dac8228b3518 tempest-AttachInterfacesV270Test-1110429519 tempest-AttachInterfacesV270Test-1110429519-project-member] Lock "91d405f9-5ad1-4e8e-9a32-1a5ef81ecc63" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 432.493s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1318.213995] env[68906]: DEBUG oslo_concurrency.lockutils [None req-97a35496-b318-4c52-a18b-dac8228b3518 tempest-AttachInterfacesV270Test-1110429519 tempest-AttachInterfacesV270Test-1110429519-project-member] Acquiring lock "91d405f9-5ad1-4e8e-9a32-1a5ef81ecc63-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1318.214227] env[68906]: DEBUG oslo_concurrency.lockutils [None req-97a35496-b318-4c52-a18b-dac8228b3518 tempest-AttachInterfacesV270Test-1110429519 tempest-AttachInterfacesV270Test-1110429519-project-member] Lock "91d405f9-5ad1-4e8e-9a32-1a5ef81ecc63-events" acquired by 
"nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1318.214583] env[68906]: DEBUG oslo_concurrency.lockutils [None req-97a35496-b318-4c52-a18b-dac8228b3518 tempest-AttachInterfacesV270Test-1110429519 tempest-AttachInterfacesV270Test-1110429519-project-member] Lock "91d405f9-5ad1-4e8e-9a32-1a5ef81ecc63-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1318.216603] env[68906]: INFO nova.compute.manager [None req-97a35496-b318-4c52-a18b-dac8228b3518 tempest-AttachInterfacesV270Test-1110429519 tempest-AttachInterfacesV270Test-1110429519-project-member] [instance: 91d405f9-5ad1-4e8e-9a32-1a5ef81ecc63] Terminating instance [ 1318.218310] env[68906]: DEBUG nova.compute.manager [None req-97a35496-b318-4c52-a18b-dac8228b3518 tempest-AttachInterfacesV270Test-1110429519 tempest-AttachInterfacesV270Test-1110429519-project-member] [instance: 91d405f9-5ad1-4e8e-9a32-1a5ef81ecc63] Start destroying the instance on the hypervisor. {{(pid=68906) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1318.218507] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-97a35496-b318-4c52-a18b-dac8228b3518 tempest-AttachInterfacesV270Test-1110429519 tempest-AttachInterfacesV270Test-1110429519-project-member] [instance: 91d405f9-5ad1-4e8e-9a32-1a5ef81ecc63] Destroying instance {{(pid=68906) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1318.218994] env[68906]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-70768bff-b5d7-40ce-b51f-5a2f3226715a {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1318.226463] env[68906]: DEBUG nova.compute.manager [None req-1a3cd471-afd5-4758-9ba7-114ed58755e9 tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] [instance: 3c36e8a4-da45-457e-b4ef-001f4a4e595f] Starting instance... {{(pid=68906) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1318.231117] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b6a86d6d-bbde-4a1c-8f8b-4fc68a21e6d0 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1318.260543] env[68906]: WARNING nova.virt.vmwareapi.vmops [None req-97a35496-b318-4c52-a18b-dac8228b3518 tempest-AttachInterfacesV270Test-1110429519 tempest-AttachInterfacesV270Test-1110429519-project-member] [instance: 91d405f9-5ad1-4e8e-9a32-1a5ef81ecc63] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 91d405f9-5ad1-4e8e-9a32-1a5ef81ecc63 could not be found. 
[ 1318.260742] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-97a35496-b318-4c52-a18b-dac8228b3518 tempest-AttachInterfacesV270Test-1110429519 tempest-AttachInterfacesV270Test-1110429519-project-member] [instance: 91d405f9-5ad1-4e8e-9a32-1a5ef81ecc63] Instance destroyed {{(pid=68906) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1318.260911] env[68906]: INFO nova.compute.manager [None req-97a35496-b318-4c52-a18b-dac8228b3518 tempest-AttachInterfacesV270Test-1110429519 tempest-AttachInterfacesV270Test-1110429519-project-member] [instance: 91d405f9-5ad1-4e8e-9a32-1a5ef81ecc63] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1318.261163] env[68906]: DEBUG oslo.service.loopingcall [None req-97a35496-b318-4c52-a18b-dac8228b3518 tempest-AttachInterfacesV270Test-1110429519 tempest-AttachInterfacesV270Test-1110429519-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=68906) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1318.261601] env[68906]: DEBUG nova.compute.manager [None req-1a3cd471-afd5-4758-9ba7-114ed58755e9 tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] [instance: 3c36e8a4-da45-457e-b4ef-001f4a4e595f] Instance disappeared before build. {{(pid=68906) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1318.262467] env[68906]: DEBUG nova.compute.manager [-] [instance: 91d405f9-5ad1-4e8e-9a32-1a5ef81ecc63] Deallocating network for instance {{(pid=68906) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1318.262565] env[68906]: DEBUG nova.network.neutron [-] [instance: 91d405f9-5ad1-4e8e-9a32-1a5ef81ecc63] deallocate_for_instance() {{(pid=68906) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1318.287325] env[68906]: DEBUG oslo_concurrency.lockutils [None req-1a3cd471-afd5-4758-9ba7-114ed58755e9 tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] Lock "3c36e8a4-da45-457e-b4ef-001f4a4e595f" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 227.664s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1318.289008] env[68906]: DEBUG nova.network.neutron [-] [instance: 91d405f9-5ad1-4e8e-9a32-1a5ef81ecc63] Updating instance_info_cache with network_info: [] {{(pid=68906) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1318.299155] env[68906]: INFO nova.compute.manager [-] [instance: 91d405f9-5ad1-4e8e-9a32-1a5ef81ecc63] Took 0.04 seconds to deallocate network for instance. [ 1318.301217] env[68906]: DEBUG nova.compute.manager [None req-7e5b0fff-7692-45bc-b660-9f08afcd6b69 tempest-AttachVolumeShelveTestJSON-1059946953 tempest-AttachVolumeShelveTestJSON-1059946953-project-member] [instance: 582a086e-5122-41f2-8fb8-513b3734eef4] Starting instance... {{(pid=68906) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1318.326761] env[68906]: DEBUG nova.compute.manager [None req-7e5b0fff-7692-45bc-b660-9f08afcd6b69 tempest-AttachVolumeShelveTestJSON-1059946953 tempest-AttachVolumeShelveTestJSON-1059946953-project-member] [instance: 582a086e-5122-41f2-8fb8-513b3734eef4] Instance disappeared before build. 
{{(pid=68906) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1318.350997] env[68906]: DEBUG oslo_concurrency.lockutils [None req-7e5b0fff-7692-45bc-b660-9f08afcd6b69 tempest-AttachVolumeShelveTestJSON-1059946953 tempest-AttachVolumeShelveTestJSON-1059946953-project-member] Lock "582a086e-5122-41f2-8fb8-513b3734eef4" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 226.742s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1318.361470] env[68906]: DEBUG nova.compute.manager [None req-b0388411-7fe1-4e49-b7da-6e4027223a15 tempest-ServerRescueTestJSON-1075537064 tempest-ServerRescueTestJSON-1075537064-project-member] [instance: 159edc16-55bb-46eb-8fa9-7da7c1f36cd0] Starting instance... {{(pid=68906) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1318.390159] env[68906]: DEBUG nova.compute.manager [None req-b0388411-7fe1-4e49-b7da-6e4027223a15 tempest-ServerRescueTestJSON-1075537064 tempest-ServerRescueTestJSON-1075537064-project-member] [instance: 159edc16-55bb-46eb-8fa9-7da7c1f36cd0] Instance disappeared before build. {{(pid=68906) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1318.408718] env[68906]: DEBUG oslo_concurrency.lockutils [None req-97a35496-b318-4c52-a18b-dac8228b3518 tempest-AttachInterfacesV270Test-1110429519 tempest-AttachInterfacesV270Test-1110429519-project-member] Lock "91d405f9-5ad1-4e8e-9a32-1a5ef81ecc63" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.195s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1318.409891] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Lock "91d405f9-5ad1-4e8e-9a32-1a5ef81ecc63" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 107.458s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1318.410432] env[68906]: INFO nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: 91d405f9-5ad1-4e8e-9a32-1a5ef81ecc63] During sync_power_state the instance has a pending task (deleting). Skip. [ 1318.410653] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Lock "91d405f9-5ad1-4e8e-9a32-1a5ef81ecc63" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.001s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1318.416625] env[68906]: DEBUG oslo_concurrency.lockutils [None req-b0388411-7fe1-4e49-b7da-6e4027223a15 tempest-ServerRescueTestJSON-1075537064 tempest-ServerRescueTestJSON-1075537064-project-member] Lock "159edc16-55bb-46eb-8fa9-7da7c1f36cd0" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 226.805s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1318.424960] env[68906]: DEBUG nova.compute.manager [None req-0b200139-c804-48cc-b35b-ce4dd8cb7f66 tempest-DeleteServersTestJSON-1763795391 tempest-DeleteServersTestJSON-1763795391-project-member] [instance: 13b471c5-c86e-4b55-a231-159b2219de2f] Starting instance... 
{{(pid=68906) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1318.447794] env[68906]: DEBUG nova.compute.manager [None req-0b200139-c804-48cc-b35b-ce4dd8cb7f66 tempest-DeleteServersTestJSON-1763795391 tempest-DeleteServersTestJSON-1763795391-project-member] [instance: 13b471c5-c86e-4b55-a231-159b2219de2f] Instance disappeared before build. {{(pid=68906) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1318.469043] env[68906]: DEBUG oslo_concurrency.lockutils [None req-0b200139-c804-48cc-b35b-ce4dd8cb7f66 tempest-DeleteServersTestJSON-1763795391 tempest-DeleteServersTestJSON-1763795391-project-member] Lock "13b471c5-c86e-4b55-a231-159b2219de2f" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 223.783s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1318.476938] env[68906]: DEBUG nova.compute.manager [None req-4c548662-eee4-4e29-93f1-9fe857d3d075 tempest-ServerRescueTestJSONUnderV235-1338611176 tempest-ServerRescueTestJSONUnderV235-1338611176-project-member] [instance: d01b8b11-bc3b-47dc-8687-a111c1453ed9] Starting instance... {{(pid=68906) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1318.499038] env[68906]: DEBUG nova.compute.manager [None req-4c548662-eee4-4e29-93f1-9fe857d3d075 tempest-ServerRescueTestJSONUnderV235-1338611176 tempest-ServerRescueTestJSONUnderV235-1338611176-project-member] [instance: d01b8b11-bc3b-47dc-8687-a111c1453ed9] Instance disappeared before build. {{(pid=68906) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1318.518082] env[68906]: DEBUG oslo_concurrency.lockutils [None req-4c548662-eee4-4e29-93f1-9fe857d3d075 tempest-ServerRescueTestJSONUnderV235-1338611176 tempest-ServerRescueTestJSONUnderV235-1338611176-project-member] Lock "d01b8b11-bc3b-47dc-8687-a111c1453ed9" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 216.176s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1318.525900] env[68906]: DEBUG nova.compute.manager [None req-5dfdbe4f-bf7e-46a4-9601-537f14957a5d tempest-ServerGroupTestJSON-848006695 tempest-ServerGroupTestJSON-848006695-project-member] [instance: 7466df8a-59a9-49b9-bff7-c4efbeae3eee] Starting instance... 
{{(pid=68906) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1318.576512] env[68906]: DEBUG oslo_concurrency.lockutils [None req-5dfdbe4f-bf7e-46a4-9601-537f14957a5d tempest-ServerGroupTestJSON-848006695 tempest-ServerGroupTestJSON-848006695-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1318.576768] env[68906]: DEBUG oslo_concurrency.lockutils [None req-5dfdbe4f-bf7e-46a4-9601-537f14957a5d tempest-ServerGroupTestJSON-848006695 tempest-ServerGroupTestJSON-848006695-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1318.578231] env[68906]: INFO nova.compute.claims [None req-5dfdbe4f-bf7e-46a4-9601-537f14957a5d tempest-ServerGroupTestJSON-848006695 tempest-ServerGroupTestJSON-848006695-project-member] [instance: 7466df8a-59a9-49b9-bff7-c4efbeae3eee] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1318.864319] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4462701a-5ff7-4136-a15a-a10b1e13d8b2 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1318.871929] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dc2355c9-886c-4596-9741-86ecd519286c {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1318.901802] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-833dc883-cbb9-49d8-99a1-51e0e204c874 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1318.908838] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3f56ece6-9d25-4fad-a459-dd49f378ff5d {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1318.921571] env[68906]: DEBUG nova.compute.provider_tree [None req-5dfdbe4f-bf7e-46a4-9601-537f14957a5d tempest-ServerGroupTestJSON-848006695 tempest-ServerGroupTestJSON-848006695-project-member] Inventory has not changed in ProviderTree for provider: 1119f6db-bfd7-4ef3-bdff-5c6974dc249b {{(pid=68906) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1318.931454] env[68906]: DEBUG nova.scheduler.client.report [None req-5dfdbe4f-bf7e-46a4-9601-537f14957a5d tempest-ServerGroupTestJSON-848006695 tempest-ServerGroupTestJSON-848006695-project-member] Inventory has not changed for provider 1119f6db-bfd7-4ef3-bdff-5c6974dc249b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 93, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68906) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1318.945404] env[68906]: DEBUG oslo_concurrency.lockutils [None req-5dfdbe4f-bf7e-46a4-9601-537f14957a5d 
tempest-ServerGroupTestJSON-848006695 tempest-ServerGroupTestJSON-848006695-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.369s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1318.958640] env[68906]: DEBUG oslo_concurrency.lockutils [None req-5dfdbe4f-bf7e-46a4-9601-537f14957a5d tempest-ServerGroupTestJSON-848006695 tempest-ServerGroupTestJSON-848006695-project-member] Acquiring lock "8d0c4353-5214-4ff0-9f9c-4db951aba9fd" by "nova.compute.manager.ComputeManager._validate_instance_group_policy.._do_validation" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1318.958874] env[68906]: DEBUG oslo_concurrency.lockutils [None req-5dfdbe4f-bf7e-46a4-9601-537f14957a5d tempest-ServerGroupTestJSON-848006695 tempest-ServerGroupTestJSON-848006695-project-member] Lock "8d0c4353-5214-4ff0-9f9c-4db951aba9fd" acquired by "nova.compute.manager.ComputeManager._validate_instance_group_policy.._do_validation" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1318.963988] env[68906]: DEBUG oslo_concurrency.lockutils [None req-5dfdbe4f-bf7e-46a4-9601-537f14957a5d tempest-ServerGroupTestJSON-848006695 tempest-ServerGroupTestJSON-848006695-project-member] Lock "8d0c4353-5214-4ff0-9f9c-4db951aba9fd" "released" by "nova.compute.manager.ComputeManager._validate_instance_group_policy.._do_validation" :: held 0.005s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1318.964470] env[68906]: DEBUG nova.compute.manager [None req-5dfdbe4f-bf7e-46a4-9601-537f14957a5d tempest-ServerGroupTestJSON-848006695 tempest-ServerGroupTestJSON-848006695-project-member] [instance: 7466df8a-59a9-49b9-bff7-c4efbeae3eee] Start building networks asynchronously for instance. {{(pid=68906) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1318.994916] env[68906]: DEBUG nova.compute.utils [None req-5dfdbe4f-bf7e-46a4-9601-537f14957a5d tempest-ServerGroupTestJSON-848006695 tempest-ServerGroupTestJSON-848006695-project-member] Using /dev/sd instead of None {{(pid=68906) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1318.996147] env[68906]: DEBUG nova.compute.manager [None req-5dfdbe4f-bf7e-46a4-9601-537f14957a5d tempest-ServerGroupTestJSON-848006695 tempest-ServerGroupTestJSON-848006695-project-member] [instance: 7466df8a-59a9-49b9-bff7-c4efbeae3eee] Allocating IP information in the background. {{(pid=68906) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1318.996327] env[68906]: DEBUG nova.network.neutron [None req-5dfdbe4f-bf7e-46a4-9601-537f14957a5d tempest-ServerGroupTestJSON-848006695 tempest-ServerGroupTestJSON-848006695-project-member] [instance: 7466df8a-59a9-49b9-bff7-c4efbeae3eee] allocate_for_instance() {{(pid=68906) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1319.008657] env[68906]: DEBUG nova.compute.manager [None req-5dfdbe4f-bf7e-46a4-9601-537f14957a5d tempest-ServerGroupTestJSON-848006695 tempest-ServerGroupTestJSON-848006695-project-member] [instance: 7466df8a-59a9-49b9-bff7-c4efbeae3eee] Start building block device mappings for instance. 
{{(pid=68906) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1319.065355] env[68906]: DEBUG nova.policy [None req-5dfdbe4f-bf7e-46a4-9601-537f14957a5d tempest-ServerGroupTestJSON-848006695 tempest-ServerGroupTestJSON-848006695-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '7d143d4e41d6430997da0e6207a8b959', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'e497eae9d645470590700225f0cc0e1f', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68906) authorize /opt/stack/nova/nova/policy.py:203}} [ 1319.072082] env[68906]: DEBUG nova.compute.manager [None req-5dfdbe4f-bf7e-46a4-9601-537f14957a5d tempest-ServerGroupTestJSON-848006695 tempest-ServerGroupTestJSON-848006695-project-member] [instance: 7466df8a-59a9-49b9-bff7-c4efbeae3eee] Start spawning the instance on the hypervisor. {{(pid=68906) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1319.096790] env[68906]: DEBUG nova.virt.hardware [None req-5dfdbe4f-bf7e-46a4-9601-537f14957a5d tempest-ServerGroupTestJSON-848006695 tempest-ServerGroupTestJSON-848006695-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-17T13:00:39Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-17T13:00:23Z,direct_url=,disk_format='vmdk',id=b1400c31-d33b-4e13-944f-4c645e62493e,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='1ae7bf3a375d41c6af5e7536af51ffd1',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-17T13:00:24Z,virtual_size=,visibility=), allow threads: False {{(pid=68906) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1319.097155] env[68906]: DEBUG nova.virt.hardware [None req-5dfdbe4f-bf7e-46a4-9601-537f14957a5d tempest-ServerGroupTestJSON-848006695 tempest-ServerGroupTestJSON-848006695-project-member] Flavor limits 0:0:0 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1319.097211] env[68906]: DEBUG nova.virt.hardware [None req-5dfdbe4f-bf7e-46a4-9601-537f14957a5d tempest-ServerGroupTestJSON-848006695 tempest-ServerGroupTestJSON-848006695-project-member] Image limits 0:0:0 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1319.097372] env[68906]: DEBUG nova.virt.hardware [None req-5dfdbe4f-bf7e-46a4-9601-537f14957a5d tempest-ServerGroupTestJSON-848006695 tempest-ServerGroupTestJSON-848006695-project-member] Flavor pref 0:0:0 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1319.098348] env[68906]: DEBUG nova.virt.hardware [None req-5dfdbe4f-bf7e-46a4-9601-537f14957a5d tempest-ServerGroupTestJSON-848006695 tempest-ServerGroupTestJSON-848006695-project-member] Image pref 0:0:0 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1319.098439] env[68906]: DEBUG nova.virt.hardware [None req-5dfdbe4f-bf7e-46a4-9601-537f14957a5d 
tempest-ServerGroupTestJSON-848006695 tempest-ServerGroupTestJSON-848006695-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}}
[ 1319.098652] env[68906]: DEBUG nova.virt.hardware [None req-5dfdbe4f-bf7e-46a4-9601-537f14957a5d tempest-ServerGroupTestJSON-848006695 tempest-ServerGroupTestJSON-848006695-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68906) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}}
[ 1319.098821] env[68906]: DEBUG nova.virt.hardware [None req-5dfdbe4f-bf7e-46a4-9601-537f14957a5d tempest-ServerGroupTestJSON-848006695 tempest-ServerGroupTestJSON-848006695-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68906) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}}
[ 1319.099031] env[68906]: DEBUG nova.virt.hardware [None req-5dfdbe4f-bf7e-46a4-9601-537f14957a5d tempest-ServerGroupTestJSON-848006695 tempest-ServerGroupTestJSON-848006695-project-member] Got 1 possible topologies {{(pid=68906) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}}
[ 1319.099225] env[68906]: DEBUG nova.virt.hardware [None req-5dfdbe4f-bf7e-46a4-9601-537f14957a5d tempest-ServerGroupTestJSON-848006695 tempest-ServerGroupTestJSON-848006695-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68906) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}}
[ 1319.099425] env[68906]: DEBUG nova.virt.hardware [None req-5dfdbe4f-bf7e-46a4-9601-537f14957a5d tempest-ServerGroupTestJSON-848006695 tempest-ServerGroupTestJSON-848006695-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68906) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}}
[ 1319.100342] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-680b29d4-e61d-48eb-a8f1-95bc65b4d6f7 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1319.109818] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4ed95218-2922-4dfe-b9dc-71c0a3742b4b {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1319.363394] env[68906]: DEBUG nova.network.neutron [None req-5dfdbe4f-bf7e-46a4-9601-537f14957a5d tempest-ServerGroupTestJSON-848006695 tempest-ServerGroupTestJSON-848006695-project-member] [instance: 7466df8a-59a9-49b9-bff7-c4efbeae3eee] Successfully created port: e14da2ca-620f-4b82-abe6-2aaf88d0a8a4 {{(pid=68906) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}}
[ 1319.919125] env[68906]: DEBUG nova.network.neutron [None req-5dfdbe4f-bf7e-46a4-9601-537f14957a5d tempest-ServerGroupTestJSON-848006695 tempest-ServerGroupTestJSON-848006695-project-member] [instance: 7466df8a-59a9-49b9-bff7-c4efbeae3eee] Successfully updated port: e14da2ca-620f-4b82-abe6-2aaf88d0a8a4 {{(pid=68906) _update_port /opt/stack/nova/nova/network/neutron.py:586}}
[ 1319.931217] env[68906]: DEBUG oslo_concurrency.lockutils [None req-5dfdbe4f-bf7e-46a4-9601-537f14957a5d tempest-ServerGroupTestJSON-848006695 tempest-ServerGroupTestJSON-848006695-project-member] Acquiring lock "refresh_cache-7466df8a-59a9-49b9-bff7-c4efbeae3eee" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 1319.931382] env[68906]: DEBUG oslo_concurrency.lockutils [None req-5dfdbe4f-bf7e-46a4-9601-537f14957a5d tempest-ServerGroupTestJSON-848006695 tempest-ServerGroupTestJSON-848006695-project-member] Acquired lock "refresh_cache-7466df8a-59a9-49b9-bff7-c4efbeae3eee" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 1319.931550] env[68906]: DEBUG nova.network.neutron [None req-5dfdbe4f-bf7e-46a4-9601-537f14957a5d tempest-ServerGroupTestJSON-848006695 tempest-ServerGroupTestJSON-848006695-project-member] [instance: 7466df8a-59a9-49b9-bff7-c4efbeae3eee] Building network info cache for instance {{(pid=68906) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 1319.972067] env[68906]: DEBUG nova.network.neutron [None req-5dfdbe4f-bf7e-46a4-9601-537f14957a5d tempest-ServerGroupTestJSON-848006695 tempest-ServerGroupTestJSON-848006695-project-member] [instance: 7466df8a-59a9-49b9-bff7-c4efbeae3eee] Instance cache missing network info. {{(pid=68906) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 1320.112303] env[68906]: DEBUG nova.compute.manager [req-1124299a-e4ed-45bd-815e-15e2312813ea req-96b8b1cc-77fb-41ee-a96d-05b5ca505835 service nova] [instance: 7466df8a-59a9-49b9-bff7-c4efbeae3eee] Received event network-vif-plugged-e14da2ca-620f-4b82-abe6-2aaf88d0a8a4 {{(pid=68906) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}}
[ 1320.112558] env[68906]: DEBUG oslo_concurrency.lockutils [req-1124299a-e4ed-45bd-815e-15e2312813ea req-96b8b1cc-77fb-41ee-a96d-05b5ca505835 service nova] Acquiring lock "7466df8a-59a9-49b9-bff7-c4efbeae3eee-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1320.112729] env[68906]: DEBUG oslo_concurrency.lockutils [req-1124299a-e4ed-45bd-815e-15e2312813ea req-96b8b1cc-77fb-41ee-a96d-05b5ca505835 service nova] Lock "7466df8a-59a9-49b9-bff7-c4efbeae3eee-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1320.112897] env[68906]: DEBUG oslo_concurrency.lockutils [req-1124299a-e4ed-45bd-815e-15e2312813ea req-96b8b1cc-77fb-41ee-a96d-05b5ca505835 service nova] Lock "7466df8a-59a9-49b9-bff7-c4efbeae3eee-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1320.113085] env[68906]: DEBUG nova.compute.manager [req-1124299a-e4ed-45bd-815e-15e2312813ea req-96b8b1cc-77fb-41ee-a96d-05b5ca505835 service nova] [instance: 7466df8a-59a9-49b9-bff7-c4efbeae3eee] No waiting events found dispatching network-vif-plugged-e14da2ca-620f-4b82-abe6-2aaf88d0a8a4 {{(pid=68906) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}}
[ 1320.113255] env[68906]: WARNING nova.compute.manager [req-1124299a-e4ed-45bd-815e-15e2312813ea req-96b8b1cc-77fb-41ee-a96d-05b5ca505835 service nova] [instance: 7466df8a-59a9-49b9-bff7-c4efbeae3eee] Received unexpected event network-vif-plugged-e14da2ca-620f-4b82-abe6-2aaf88d0a8a4 for instance with vm_state building and task_state spawning.
[ 1320.113414] env[68906]: DEBUG nova.compute.manager [req-1124299a-e4ed-45bd-815e-15e2312813ea req-96b8b1cc-77fb-41ee-a96d-05b5ca505835 service nova] [instance: 7466df8a-59a9-49b9-bff7-c4efbeae3eee] Received event network-changed-e14da2ca-620f-4b82-abe6-2aaf88d0a8a4 {{(pid=68906) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}}
[ 1320.113568] env[68906]: DEBUG nova.compute.manager [req-1124299a-e4ed-45bd-815e-15e2312813ea req-96b8b1cc-77fb-41ee-a96d-05b5ca505835 service nova] [instance: 7466df8a-59a9-49b9-bff7-c4efbeae3eee] Refreshing instance network info cache due to event network-changed-e14da2ca-620f-4b82-abe6-2aaf88d0a8a4. {{(pid=68906) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}}
[ 1320.113732] env[68906]: DEBUG oslo_concurrency.lockutils [req-1124299a-e4ed-45bd-815e-15e2312813ea req-96b8b1cc-77fb-41ee-a96d-05b5ca505835 service nova] Acquiring lock "refresh_cache-7466df8a-59a9-49b9-bff7-c4efbeae3eee" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 1320.137798] env[68906]: DEBUG nova.network.neutron [None req-5dfdbe4f-bf7e-46a4-9601-537f14957a5d tempest-ServerGroupTestJSON-848006695 tempest-ServerGroupTestJSON-848006695-project-member] [instance: 7466df8a-59a9-49b9-bff7-c4efbeae3eee] Updating instance_info_cache with network_info: [{"id": "e14da2ca-620f-4b82-abe6-2aaf88d0a8a4", "address": "fa:16:3e:6e:c7:46", "network": {"id": "177abda0-3a78-4d8d-b075-aba3e498fdc6", "bridge": "br-int", "label": "tempest-ServerGroupTestJSON-826793290-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "e497eae9d645470590700225f0cc0e1f", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "c0d5204b-f60e-4830-84c8-2fe246c28202", "external-id": "nsx-vlan-transportzone-104", "segmentation_id": 104, "bound_drivers": {"0": "nsxv3"}}, "devname": "tape14da2ca-62", "ovs_interfaceid": "e14da2ca-620f-4b82-abe6-2aaf88d0a8a4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68906) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 1320.152020] env[68906]: DEBUG oslo_concurrency.lockutils [None req-5dfdbe4f-bf7e-46a4-9601-537f14957a5d tempest-ServerGroupTestJSON-848006695 tempest-ServerGroupTestJSON-848006695-project-member] Releasing lock "refresh_cache-7466df8a-59a9-49b9-bff7-c4efbeae3eee" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 1320.152020] env[68906]: DEBUG nova.compute.manager [None req-5dfdbe4f-bf7e-46a4-9601-537f14957a5d tempest-ServerGroupTestJSON-848006695 tempest-ServerGroupTestJSON-848006695-project-member] [instance: 7466df8a-59a9-49b9-bff7-c4efbeae3eee] Instance network_info: |[{"id": "e14da2ca-620f-4b82-abe6-2aaf88d0a8a4", "address": "fa:16:3e:6e:c7:46", "network": {"id": "177abda0-3a78-4d8d-b075-aba3e498fdc6", "bridge": "br-int", "label": "tempest-ServerGroupTestJSON-826793290-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "e497eae9d645470590700225f0cc0e1f", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "c0d5204b-f60e-4830-84c8-2fe246c28202", "external-id": "nsx-vlan-transportzone-104", "segmentation_id": 104, "bound_drivers": {"0": "nsxv3"}}, "devname": "tape14da2ca-62", "ovs_interfaceid": "e14da2ca-620f-4b82-abe6-2aaf88d0a8a4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=68906) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}}
[ 1320.152220] env[68906]: DEBUG oslo_concurrency.lockutils [req-1124299a-e4ed-45bd-815e-15e2312813ea req-96b8b1cc-77fb-41ee-a96d-05b5ca505835 service nova] Acquired lock "refresh_cache-7466df8a-59a9-49b9-bff7-c4efbeae3eee" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 1320.152220] env[68906]: DEBUG nova.network.neutron [req-1124299a-e4ed-45bd-815e-15e2312813ea req-96b8b1cc-77fb-41ee-a96d-05b5ca505835 service nova] [instance: 7466df8a-59a9-49b9-bff7-c4efbeae3eee] Refreshing network info cache for port e14da2ca-620f-4b82-abe6-2aaf88d0a8a4 {{(pid=68906) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}}
[ 1320.152220] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-5dfdbe4f-bf7e-46a4-9601-537f14957a5d tempest-ServerGroupTestJSON-848006695 tempest-ServerGroupTestJSON-848006695-project-member] [instance: 7466df8a-59a9-49b9-bff7-c4efbeae3eee] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:6e:c7:46', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'c0d5204b-f60e-4830-84c8-2fe246c28202', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'e14da2ca-620f-4b82-abe6-2aaf88d0a8a4', 'vif_model': 'vmxnet3'}] {{(pid=68906) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}}
[ 1320.159720] env[68906]: DEBUG nova.virt.vmwareapi.vm_util [None req-5dfdbe4f-bf7e-46a4-9601-537f14957a5d tempest-ServerGroupTestJSON-848006695 tempest-ServerGroupTestJSON-848006695-project-member] Creating folder: Project (e497eae9d645470590700225f0cc0e1f). Parent ref: group-v694750. {{(pid=68906) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}}
[ 1320.160721] env[68906]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-28e07cfd-2549-42d2-9ad6-42a49b5fcb02 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1320.177017] env[68906]: INFO nova.virt.vmwareapi.vm_util [None req-5dfdbe4f-bf7e-46a4-9601-537f14957a5d tempest-ServerGroupTestJSON-848006695 tempest-ServerGroupTestJSON-848006695-project-member] Created folder: Project (e497eae9d645470590700225f0cc0e1f) in parent group-v694750.
[ 1320.177017] env[68906]: DEBUG nova.virt.vmwareapi.vm_util [None req-5dfdbe4f-bf7e-46a4-9601-537f14957a5d tempest-ServerGroupTestJSON-848006695 tempest-ServerGroupTestJSON-848006695-project-member] Creating folder: Instances. Parent ref: group-v694825. {{(pid=68906) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}}
[ 1320.177017] env[68906]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-853cc886-d44d-48b9-98bf-d39e5026d187 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1320.184066] env[68906]: INFO nova.virt.vmwareapi.vm_util [None req-5dfdbe4f-bf7e-46a4-9601-537f14957a5d tempest-ServerGroupTestJSON-848006695 tempest-ServerGroupTestJSON-848006695-project-member] Created folder: Instances in parent group-v694825.
[ 1320.184307] env[68906]: DEBUG oslo.service.loopingcall [None req-5dfdbe4f-bf7e-46a4-9601-537f14957a5d tempest-ServerGroupTestJSON-848006695 tempest-ServerGroupTestJSON-848006695-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=68906) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}}
[ 1320.184486] env[68906]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 7466df8a-59a9-49b9-bff7-c4efbeae3eee] Creating VM on the ESX host {{(pid=68906) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}}
[ 1320.184684] env[68906]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-58bdcd36-c2bf-42a5-8fcb-79189b1979dc {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1320.206780] env[68906]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){
[ 1320.206780] env[68906]: value = "task-3475385"
[ 1320.206780] env[68906]: _type = "Task"
[ 1320.206780] env[68906]: } to complete. {{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 1320.214795] env[68906]: DEBUG oslo_vmware.api [-] Task: {'id': task-3475385, 'name': CreateVM_Task} progress is 0%. {{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 1320.417312] env[68906]: DEBUG nova.network.neutron [req-1124299a-e4ed-45bd-815e-15e2312813ea req-96b8b1cc-77fb-41ee-a96d-05b5ca505835 service nova] [instance: 7466df8a-59a9-49b9-bff7-c4efbeae3eee] Updated VIF entry in instance network info cache for port e14da2ca-620f-4b82-abe6-2aaf88d0a8a4. {{(pid=68906) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}}
[ 1320.417760] env[68906]: DEBUG nova.network.neutron [req-1124299a-e4ed-45bd-815e-15e2312813ea req-96b8b1cc-77fb-41ee-a96d-05b5ca505835 service nova] [instance: 7466df8a-59a9-49b9-bff7-c4efbeae3eee] Updating instance_info_cache with network_info: [{"id": "e14da2ca-620f-4b82-abe6-2aaf88d0a8a4", "address": "fa:16:3e:6e:c7:46", "network": {"id": "177abda0-3a78-4d8d-b075-aba3e498fdc6", "bridge": "br-int", "label": "tempest-ServerGroupTestJSON-826793290-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "e497eae9d645470590700225f0cc0e1f", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "c0d5204b-f60e-4830-84c8-2fe246c28202", "external-id": "nsx-vlan-transportzone-104", "segmentation_id": 104, "bound_drivers": {"0": "nsxv3"}}, "devname": "tape14da2ca-62", "ovs_interfaceid": "e14da2ca-620f-4b82-abe6-2aaf88d0a8a4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68906) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 1320.427765] env[68906]: DEBUG oslo_concurrency.lockutils [req-1124299a-e4ed-45bd-815e-15e2312813ea req-96b8b1cc-77fb-41ee-a96d-05b5ca505835 service nova] Releasing lock "refresh_cache-7466df8a-59a9-49b9-bff7-c4efbeae3eee" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 1320.716273] env[68906]: DEBUG oslo_vmware.api [-] Task: {'id': task-3475385, 'name': CreateVM_Task, 'duration_secs': 0.324839} completed successfully. {{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}}
[ 1320.716445] env[68906]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 7466df8a-59a9-49b9-bff7-c4efbeae3eee] Created VM on the ESX host {{(pid=68906) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}}
[ 1320.717100] env[68906]: DEBUG oslo_concurrency.lockutils [None req-5dfdbe4f-bf7e-46a4-9601-537f14957a5d tempest-ServerGroupTestJSON-848006695 tempest-ServerGroupTestJSON-848006695-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 1320.717280] env[68906]: DEBUG oslo_concurrency.lockutils [None req-5dfdbe4f-bf7e-46a4-9601-537f14957a5d tempest-ServerGroupTestJSON-848006695 tempest-ServerGroupTestJSON-848006695-project-member] Acquired lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 1320.717585] env[68906]: DEBUG oslo_concurrency.lockutils [None req-5dfdbe4f-bf7e-46a4-9601-537f14957a5d tempest-ServerGroupTestJSON-848006695 tempest-ServerGroupTestJSON-848006695-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}}
[ 1320.717825] env[68906]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-e2e4cf8e-fa43-4c78-88ef-062d2abc1059 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1320.721953] env[68906]: DEBUG oslo_vmware.api [None req-5dfdbe4f-bf7e-46a4-9601-537f14957a5d tempest-ServerGroupTestJSON-848006695 tempest-ServerGroupTestJSON-848006695-project-member] Waiting for the task: (returnval){
[ 1320.721953] env[68906]: value = "session[52a3cb0d-1212-490c-2669-91043b4da4d8]52f4547a-ec05-4740-75d9-b9633bbcb8b5"
[ 1320.721953] env[68906]: _type = "Task"
[ 1320.721953] env[68906]: } to complete. {{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 1320.729401] env[68906]: DEBUG oslo_vmware.api [None req-5dfdbe4f-bf7e-46a4-9601-537f14957a5d tempest-ServerGroupTestJSON-848006695 tempest-ServerGroupTestJSON-848006695-project-member] Task: {'id': session[52a3cb0d-1212-490c-2669-91043b4da4d8]52f4547a-ec05-4740-75d9-b9633bbcb8b5, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 1321.233726] env[68906]: DEBUG oslo_concurrency.lockutils [None req-5dfdbe4f-bf7e-46a4-9601-537f14957a5d tempest-ServerGroupTestJSON-848006695 tempest-ServerGroupTestJSON-848006695-project-member] Releasing lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 1321.233997] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-5dfdbe4f-bf7e-46a4-9601-537f14957a5d tempest-ServerGroupTestJSON-848006695 tempest-ServerGroupTestJSON-848006695-project-member] [instance: 7466df8a-59a9-49b9-bff7-c4efbeae3eee] Processing image b1400c31-d33b-4e13-944f-4c645e62493e {{(pid=68906) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}}
[ 1321.234208] env[68906]: DEBUG oslo_concurrency.lockutils [None req-5dfdbe4f-bf7e-46a4-9601-537f14957a5d tempest-ServerGroupTestJSON-848006695 tempest-ServerGroupTestJSON-848006695-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e/b1400c31-d33b-4e13-944f-4c645e62493e.vmdk" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 1330.858726] env[68906]: DEBUG oslo_concurrency.lockutils [None req-d161b1c7-2d71-466f-b59f-1fcc08f7ea0b tempest-ServerGroupTestJSON-848006695 tempest-ServerGroupTestJSON-848006695-project-member] Acquiring lock "7466df8a-59a9-49b9-bff7-c4efbeae3eee" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1359.863605] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1363.140364] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1363.140695] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Starting heal instance info cache {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}}
[ 1363.140695] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Rebuilding the list of instances to heal {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}}
[ 1363.162434] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: acc11633-a489-4d8f-ad76-f17049a91545] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 1363.163094] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: e7286888-d79d-4632-9c06-69c1ef47fa50] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 1363.163094] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: 641cca5b-d749-4331-a5e0-8acb6d47cba2] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 1363.163094] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: 9bd17d9a-2f34-4e99-91b1-a7c3fbc1f29a] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 1363.163094] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: 4d36bb91-0cde-44cb-8706-d17740a9cf50] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 1363.163094] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: db011373-7285-4882-8bce-d39cfa22fe80] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 1363.163454] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: 1fdb401a-ac25-4418-803c-fc0b2297f2d4] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 1363.163454] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: 6c28f571-e74a-48f9-9cc7-a9e4ddea8b09] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 1363.163454] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: e0e595e3-e47e-4cf1-8977-f004eca942d1] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 1363.163552] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: 7466df8a-59a9-49b9-bff7-c4efbeae3eee] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 1363.163655] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Didn't find any instances for network info cache update. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}}
[ 1363.164178] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1363.164353] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1363.164488] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=68906) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}}
[ 1364.141191] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1365.282758] env[68906]: WARNING oslo_vmware.rw_handles [None req-86e3143e-5477-482e-8022-d48225e10873 tempest-ListImageFiltersTestJSON-730025420 tempest-ListImageFiltersTestJSON-730025420-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response
[ 1365.282758] env[68906]: ERROR oslo_vmware.rw_handles Traceback (most recent call last):
[ 1365.282758] env[68906]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close
[ 1365.282758] env[68906]: ERROR oslo_vmware.rw_handles self._conn.getresponse()
[ 1365.282758] env[68906]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse
[ 1365.282758] env[68906]: ERROR oslo_vmware.rw_handles response.begin()
[ 1365.282758] env[68906]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin
[ 1365.282758] env[68906]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status()
[ 1365.282758] env[68906]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status
[ 1365.282758] env[68906]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without"
[ 1365.282758] env[68906]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response
[ 1365.282758] env[68906]: ERROR oslo_vmware.rw_handles
[ 1365.283381] env[68906]: DEBUG nova.virt.vmwareapi.images [None req-86e3143e-5477-482e-8022-d48225e10873 tempest-ListImageFiltersTestJSON-730025420 tempest-ListImageFiltersTestJSON-730025420-project-member] [instance: acc11633-a489-4d8f-ad76-f17049a91545] Downloaded image file data b1400c31-d33b-4e13-944f-4c645e62493e to vmware_temp/dd01627f-54f8-4add-8675-5497474dcf23/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk on the data store datastore2 {{(pid=68906) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}}
[ 1365.285228] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-86e3143e-5477-482e-8022-d48225e10873 tempest-ListImageFiltersTestJSON-730025420 tempest-ListImageFiltersTestJSON-730025420-project-member] [instance: acc11633-a489-4d8f-ad76-f17049a91545] Caching image {{(pid=68906) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}}
[ 1365.285476] env[68906]: DEBUG nova.virt.vmwareapi.vm_util [None req-86e3143e-5477-482e-8022-d48225e10873 tempest-ListImageFiltersTestJSON-730025420 tempest-ListImageFiltersTestJSON-730025420-project-member] Copying Virtual Disk [datastore2] vmware_temp/dd01627f-54f8-4add-8675-5497474dcf23/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk to [datastore2] vmware_temp/dd01627f-54f8-4add-8675-5497474dcf23/b1400c31-d33b-4e13-944f-4c645e62493e/b1400c31-d33b-4e13-944f-4c645e62493e.vmdk {{(pid=68906) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}}
[ 1365.285757] env[68906]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-85c2aa8b-7219-4e56-8979-d9df98c586d0 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1365.294744] env[68906]: DEBUG oslo_vmware.api [None req-86e3143e-5477-482e-8022-d48225e10873 tempest-ListImageFiltersTestJSON-730025420 tempest-ListImageFiltersTestJSON-730025420-project-member] Waiting for the task: (returnval){
[ 1365.294744] env[68906]: value = "task-3475386"
[ 1365.294744] env[68906]: _type = "Task"
[ 1365.294744] env[68906]: } to complete. {{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 1365.302468] env[68906]: DEBUG oslo_vmware.api [None req-86e3143e-5477-482e-8022-d48225e10873 tempest-ListImageFiltersTestJSON-730025420 tempest-ListImageFiltersTestJSON-730025420-project-member] Task: {'id': task-3475386, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 1365.804884] env[68906]: DEBUG oslo_vmware.exceptions [None req-86e3143e-5477-482e-8022-d48225e10873 tempest-ListImageFiltersTestJSON-730025420 tempest-ListImageFiltersTestJSON-730025420-project-member] Fault InvalidArgument not matched. {{(pid=68906) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}}
[ 1365.805152] env[68906]: DEBUG oslo_concurrency.lockutils [None req-86e3143e-5477-482e-8022-d48225e10873 tempest-ListImageFiltersTestJSON-730025420 tempest-ListImageFiltersTestJSON-730025420-project-member] Releasing lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e/b1400c31-d33b-4e13-944f-4c645e62493e.vmdk" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 1365.805734] env[68906]: ERROR nova.compute.manager [None req-86e3143e-5477-482e-8022-d48225e10873 tempest-ListImageFiltersTestJSON-730025420 tempest-ListImageFiltersTestJSON-730025420-project-member] [instance: acc11633-a489-4d8f-ad76-f17049a91545] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 1365.805734] env[68906]: Faults: ['InvalidArgument']
[ 1365.805734] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] Traceback (most recent call last):
[ 1365.805734] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources
[ 1365.805734] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] yield resources
[ 1365.805734] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance
[ 1365.805734] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] self.driver.spawn(context, instance, image_meta,
[ 1365.805734] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn
[ 1365.805734] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 1365.805734] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 1365.805734] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] self._fetch_image_if_missing(context, vi)
[ 1365.805734] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 1365.806148] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] image_cache(vi, tmp_image_ds_loc)
[ 1365.806148] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 1365.806148] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] vm_util.copy_virtual_disk(
[ 1365.806148] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 1365.806148] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] session._wait_for_task(vmdk_copy_task)
[ 1365.806148] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 1365.806148] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] return self.wait_for_task(task_ref)
[ 1365.806148] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 1365.806148] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] return evt.wait()
[ 1365.806148] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait
[ 1365.806148] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] result = hub.switch()
[ 1365.806148] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch
[ 1365.806148] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] return self.greenlet.switch()
[ 1365.806468] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 1365.806468] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] self.f(*self.args, **self.kw)
[ 1365.806468] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 1365.806468] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] raise exceptions.translate_fault(task_info.error)
[ 1365.806468] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 1365.806468] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] Faults: ['InvalidArgument']
[ 1365.806468] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545]
[ 1365.806468] env[68906]: INFO nova.compute.manager [None req-86e3143e-5477-482e-8022-d48225e10873 tempest-ListImageFiltersTestJSON-730025420 tempest-ListImageFiltersTestJSON-730025420-project-member] [instance: acc11633-a489-4d8f-ad76-f17049a91545] Terminating instance
[ 1365.807580] env[68906]: DEBUG oslo_concurrency.lockutils [None req-e2cf9422-57fc-4510-acba-da32daf36063 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Acquired lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e/b1400c31-d33b-4e13-944f-4c645e62493e.vmdk" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 1365.807794] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-e2cf9422-57fc-4510-acba-da32daf36063 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=68906) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 1365.808046] env[68906]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-5a9a9c68-512c-4360-a7c7-b1a002b863f8 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1365.810238] env[68906]: DEBUG nova.compute.manager [None req-86e3143e-5477-482e-8022-d48225e10873 tempest-ListImageFiltersTestJSON-730025420 tempest-ListImageFiltersTestJSON-730025420-project-member] [instance: acc11633-a489-4d8f-ad76-f17049a91545] Start destroying the instance on the hypervisor. {{(pid=68906) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}}
[ 1365.810432] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-86e3143e-5477-482e-8022-d48225e10873 tempest-ListImageFiltersTestJSON-730025420 tempest-ListImageFiltersTestJSON-730025420-project-member] [instance: acc11633-a489-4d8f-ad76-f17049a91545] Destroying instance {{(pid=68906) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 1365.811218] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a172a63c-60d6-48d2-8bdb-b9cfef9f1c16 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1365.818295] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-86e3143e-5477-482e-8022-d48225e10873 tempest-ListImageFiltersTestJSON-730025420 tempest-ListImageFiltersTestJSON-730025420-project-member] [instance: acc11633-a489-4d8f-ad76-f17049a91545] Unregistering the VM {{(pid=68906) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}}
[ 1365.819284] env[68906]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-162dd8bf-b8e8-454a-8d7e-b09acd28ab2e {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1365.820701] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-e2cf9422-57fc-4510-acba-da32daf36063 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=68906) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 1365.820863] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-e2cf9422-57fc-4510-acba-da32daf36063 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=68906) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}}
[ 1365.821559] env[68906]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-706163f5-e104-4fbc-b104-a554623d960b {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1365.826775] env[68906]: DEBUG oslo_vmware.api [None req-e2cf9422-57fc-4510-acba-da32daf36063 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Waiting for the task: (returnval){
[ 1365.826775] env[68906]: value = "session[52a3cb0d-1212-490c-2669-91043b4da4d8]52bce358-78ec-4eef-897c-40c630ace7cd"
[ 1365.826775] env[68906]: _type = "Task"
[ 1365.826775] env[68906]: } to complete. {{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 1365.834389] env[68906]: DEBUG oslo_vmware.api [None req-e2cf9422-57fc-4510-acba-da32daf36063 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Task: {'id': session[52a3cb0d-1212-490c-2669-91043b4da4d8]52bce358-78ec-4eef-897c-40c630ace7cd, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 1365.891347] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-86e3143e-5477-482e-8022-d48225e10873 tempest-ListImageFiltersTestJSON-730025420 tempest-ListImageFiltersTestJSON-730025420-project-member] [instance: acc11633-a489-4d8f-ad76-f17049a91545] Unregistered the VM {{(pid=68906) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}}
[ 1365.891573] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-86e3143e-5477-482e-8022-d48225e10873 tempest-ListImageFiltersTestJSON-730025420 tempest-ListImageFiltersTestJSON-730025420-project-member] [instance: acc11633-a489-4d8f-ad76-f17049a91545] Deleting contents of the VM from datastore datastore2 {{(pid=68906) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}}
[ 1365.891755] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-86e3143e-5477-482e-8022-d48225e10873 tempest-ListImageFiltersTestJSON-730025420 tempest-ListImageFiltersTestJSON-730025420-project-member] Deleting the datastore file [datastore2] acc11633-a489-4d8f-ad76-f17049a91545 {{(pid=68906) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}}
[ 1365.892037] env[68906]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-b8da455a-68f5-41f2-9c28-740293e3f4d6 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1365.898670] env[68906]: DEBUG oslo_vmware.api [None req-86e3143e-5477-482e-8022-d48225e10873 tempest-ListImageFiltersTestJSON-730025420 tempest-ListImageFiltersTestJSON-730025420-project-member] Waiting for the task: (returnval){
[ 1365.898670] env[68906]: value = "task-3475388"
[ 1365.898670] env[68906]: _type = "Task"
[ 1365.898670] env[68906]: } to complete. {{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 1365.906702] env[68906]: DEBUG oslo_vmware.api [None req-86e3143e-5477-482e-8022-d48225e10873 tempest-ListImageFiltersTestJSON-730025420 tempest-ListImageFiltersTestJSON-730025420-project-member] Task: {'id': task-3475388, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 1366.337873] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-e2cf9422-57fc-4510-acba-da32daf36063 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] [instance: e7286888-d79d-4632-9c06-69c1ef47fa50] Preparing fetch location {{(pid=68906) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}}
[ 1366.338209] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-e2cf9422-57fc-4510-acba-da32daf36063 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Creating directory with path [datastore2] vmware_temp/8b3d9d67-80e6-4d1e-9a99-8c2390ef71a5/b1400c31-d33b-4e13-944f-4c645e62493e {{(pid=68906) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 1366.338360] env[68906]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-a128887f-ce5b-43db-8ef5-a3a47c7bcd6d {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1366.368547] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-e2cf9422-57fc-4510-acba-da32daf36063 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Created directory with path [datastore2] vmware_temp/8b3d9d67-80e6-4d1e-9a99-8c2390ef71a5/b1400c31-d33b-4e13-944f-4c645e62493e {{(pid=68906) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 1366.368737] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-e2cf9422-57fc-4510-acba-da32daf36063 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] [instance: e7286888-d79d-4632-9c06-69c1ef47fa50] Fetch image to [datastore2] vmware_temp/8b3d9d67-80e6-4d1e-9a99-8c2390ef71a5/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk {{(pid=68906) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}}
[ 1366.368907] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-e2cf9422-57fc-4510-acba-da32daf36063 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] [instance: e7286888-d79d-4632-9c06-69c1ef47fa50] Downloading image file data b1400c31-d33b-4e13-944f-4c645e62493e to [datastore2] vmware_temp/8b3d9d67-80e6-4d1e-9a99-8c2390ef71a5/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk on the data store datastore2 {{(pid=68906) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}}
[ 1366.369682] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-51945316-575b-4801-aecb-5de9e4b85691 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1366.375892] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d03125b8-59b5-4890-952b-e6489f4134b1 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1366.385673] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-968596e1-6ef6-4194-b4d4-7ce0abdb0884 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1366.418236] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a1d6d68a-4b4b-4de3-93c0-3faae3544afe {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1366.424826] env[68906]: DEBUG oslo_vmware.api [None req-86e3143e-5477-482e-8022-d48225e10873 tempest-ListImageFiltersTestJSON-730025420 tempest-ListImageFiltersTestJSON-730025420-project-member] Task: {'id': task-3475388, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.081501} completed successfully. {{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}}
[ 1366.426255] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-86e3143e-5477-482e-8022-d48225e10873 tempest-ListImageFiltersTestJSON-730025420 tempest-ListImageFiltersTestJSON-730025420-project-member] Deleted the datastore file {{(pid=68906) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}}
[ 1366.426485] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-86e3143e-5477-482e-8022-d48225e10873 tempest-ListImageFiltersTestJSON-730025420 tempest-ListImageFiltersTestJSON-730025420-project-member] [instance: acc11633-a489-4d8f-ad76-f17049a91545] Deleted contents of the VM from datastore datastore2 {{(pid=68906) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}}
[ 1366.426708] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-86e3143e-5477-482e-8022-d48225e10873 tempest-ListImageFiltersTestJSON-730025420 tempest-ListImageFiltersTestJSON-730025420-project-member] [instance: acc11633-a489-4d8f-ad76-f17049a91545] Instance destroyed {{(pid=68906) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 1366.426935] env[68906]: INFO nova.compute.manager [None req-86e3143e-5477-482e-8022-d48225e10873 tempest-ListImageFiltersTestJSON-730025420 tempest-ListImageFiltersTestJSON-730025420-project-member] [instance: acc11633-a489-4d8f-ad76-f17049a91545] Took 0.62 seconds to destroy the instance on the hypervisor.
[ 1366.428805] env[68906]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-0c1fa992-bd2a-4f34-ac40-c3bab292c392 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1366.430711] env[68906]: DEBUG nova.compute.claims [None req-86e3143e-5477-482e-8022-d48225e10873 tempest-ListImageFiltersTestJSON-730025420 tempest-ListImageFiltersTestJSON-730025420-project-member] [instance: acc11633-a489-4d8f-ad76-f17049a91545] Aborting claim: {{(pid=68906) abort /opt/stack/nova/nova/compute/claims.py:85}}
[ 1366.430924] env[68906]: DEBUG oslo_concurrency.lockutils [None req-86e3143e-5477-482e-8022-d48225e10873 tempest-ListImageFiltersTestJSON-730025420 tempest-ListImageFiltersTestJSON-730025420-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1366.431199] env[68906]: DEBUG oslo_concurrency.lockutils [None req-86e3143e-5477-482e-8022-d48225e10873 tempest-ListImageFiltersTestJSON-730025420 tempest-ListImageFiltersTestJSON-730025420-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1366.453837] env[68906]: DEBUG nova.virt.vmwareapi.images [None req-e2cf9422-57fc-4510-acba-da32daf36063 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] [instance: e7286888-d79d-4632-9c06-69c1ef47fa50] Downloading image file data b1400c31-d33b-4e13-944f-4c645e62493e to the data store datastore2 {{(pid=68906) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}}
[ 1366.504399] env[68906]: DEBUG oslo_vmware.rw_handles [None req-e2cf9422-57fc-4510-acba-da32daf36063 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/8b3d9d67-80e6-4d1e-9a99-8c2390ef71a5/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=68906) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}}
[ 1366.565278] env[68906]: DEBUG oslo_vmware.rw_handles [None req-e2cf9422-57fc-4510-acba-da32daf36063 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Completed reading data from the image iterator. {{(pid=68906) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}}
[ 1366.566043] env[68906]: DEBUG oslo_vmware.rw_handles [None req-e2cf9422-57fc-4510-acba-da32daf36063 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Closing write handle for https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/8b3d9d67-80e6-4d1e-9a99-8c2390ef71a5/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=68906) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}}
[ 1366.804443] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-70a5c35e-3a7e-487f-9a8b-05b9e1daa277 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1366.812420] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-308c67f4-ea5f-4bdf-b341-6af8170907ae {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1366.844745] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-780c94b5-9fb5-49cd-93b8-9feb2c544198 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1366.851937] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ea6f10a6-1397-4822-8bdc-319746ef7878 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1366.865360] env[68906]: DEBUG nova.compute.provider_tree [None req-86e3143e-5477-482e-8022-d48225e10873 tempest-ListImageFiltersTestJSON-730025420 tempest-ListImageFiltersTestJSON-730025420-project-member] Inventory has not changed in ProviderTree for provider: 1119f6db-bfd7-4ef3-bdff-5c6974dc249b {{(pid=68906) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 1366.874403] env[68906]: DEBUG nova.scheduler.client.report [None req-86e3143e-5477-482e-8022-d48225e10873 tempest-ListImageFiltersTestJSON-730025420 tempest-ListImageFiltersTestJSON-730025420-project-member] Inventory has not changed for provider 1119f6db-bfd7-4ef3-bdff-5c6974dc249b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 93, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68906) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 1366.889445] env[68906]: DEBUG oslo_concurrency.lockutils [None req-86e3143e-5477-482e-8022-d48225e10873 tempest-ListImageFiltersTestJSON-730025420 tempest-ListImageFiltersTestJSON-730025420-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.458s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1366.889984] env[68906]: ERROR nova.compute.manager [None req-86e3143e-5477-482e-8022-d48225e10873 tempest-ListImageFiltersTestJSON-730025420 tempest-ListImageFiltersTestJSON-730025420-project-member] [instance: acc11633-a489-4d8f-ad76-f17049a91545] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 1366.889984] env[68906]: Faults: ['InvalidArgument']
[ 1366.889984] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] Traceback (most recent call last):
[ 1366.889984] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance
[ 1366.889984] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] self.driver.spawn(context, instance, image_meta,
[ 1366.889984] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn
[ 1366.889984] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 1366.889984] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 1366.889984] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] self._fetch_image_if_missing(context, vi)
[ 1366.889984] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 1366.889984] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] image_cache(vi, tmp_image_ds_loc)
[ 1366.889984] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 1366.890348] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] vm_util.copy_virtual_disk(
[ 1366.890348] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 1366.890348] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] session._wait_for_task(vmdk_copy_task)
[ 1366.890348] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 1366.890348] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] return self.wait_for_task(task_ref)
[ 1366.890348] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 1366.890348] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] return evt.wait()
[ 1366.890348] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait
[ 1366.890348] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] result = hub.switch()
[ 1366.890348] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch
[ 1366.890348] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] return self.greenlet.switch()
[ 1366.890348] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 1366.890348] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] self.f(*self.args, **self.kw)
[ 1366.890749] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 1366.890749] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] raise exceptions.translate_fault(task_info.error)
[ 1366.890749] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 1366.890749] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] Faults: ['InvalidArgument']
[ 1366.890749] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545]
[ 1366.890749] env[68906]: DEBUG nova.compute.utils [None req-86e3143e-5477-482e-8022-d48225e10873 tempest-ListImageFiltersTestJSON-730025420 tempest-ListImageFiltersTestJSON-730025420-project-member] [instance: acc11633-a489-4d8f-ad76-f17049a91545] VimFaultException {{(pid=68906) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 1366.892205] env[68906]: DEBUG nova.compute.manager [None req-86e3143e-5477-482e-8022-d48225e10873 tempest-ListImageFiltersTestJSON-730025420 tempest-ListImageFiltersTestJSON-730025420-project-member] [instance: acc11633-a489-4d8f-ad76-f17049a91545] Build of instance acc11633-a489-4d8f-ad76-f17049a91545 was re-scheduled: A specified parameter was not correct: fileType
[ 1366.892205] env[68906]: Faults: ['InvalidArgument'] {{(pid=68906) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}}
[ 1366.892579] env[68906]: DEBUG nova.compute.manager [None req-86e3143e-5477-482e-8022-d48225e10873 tempest-ListImageFiltersTestJSON-730025420 tempest-ListImageFiltersTestJSON-730025420-project-member] [instance: acc11633-a489-4d8f-ad76-f17049a91545] Unplugging VIFs for instance {{(pid=68906) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}}
[ 1366.892756] env[68906]: DEBUG nova.compute.manager [None req-86e3143e-5477-482e-8022-d48225e10873 tempest-ListImageFiltersTestJSON-730025420 tempest-ListImageFiltersTestJSON-730025420-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=68906) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}}
[ 1366.892912] env[68906]: DEBUG nova.compute.manager [None req-86e3143e-5477-482e-8022-d48225e10873 tempest-ListImageFiltersTestJSON-730025420 tempest-ListImageFiltersTestJSON-730025420-project-member] [instance: acc11633-a489-4d8f-ad76-f17049a91545] Deallocating network for instance {{(pid=68906) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}}
[ 1366.893091] env[68906]: DEBUG nova.network.neutron [None req-86e3143e-5477-482e-8022-d48225e10873 tempest-ListImageFiltersTestJSON-730025420 tempest-ListImageFiltersTestJSON-730025420-project-member] [instance: acc11633-a489-4d8f-ad76-f17049a91545] deallocate_for_instance() {{(pid=68906) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 1367.023053] env[68906]: DEBUG neutronclient.v2_0.client [None req-86e3143e-5477-482e-8022-d48225e10873 tempest-ListImageFiltersTestJSON-730025420 tempest-ListImageFiltersTestJSON-730025420-project-member] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=68906) _handle_fault_response /opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py:262}}
[ 1367.024118] env[68906]: ERROR nova.compute.manager [None req-86e3143e-5477-482e-8022-d48225e10873 tempest-ListImageFiltersTestJSON-730025420 tempest-ListImageFiltersTestJSON-730025420-project-member] [instance: acc11633-a489-4d8f-ad76-f17049a91545] Failed to deallocate networks: nova.exception.Unauthorized: Not authorized.
[ 1367.024118] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] Traceback (most recent call last):
[ 1367.024118] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance
[ 1367.024118] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] self.driver.spawn(context, instance, image_meta,
[ 1367.024118] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn
[ 1367.024118] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 1367.024118] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 1367.024118] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] self._fetch_image_if_missing(context, vi)
[ 1367.024118] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 1367.024118] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] image_cache(vi, tmp_image_ds_loc)
[ 1367.024118] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 1367.024118] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] vm_util.copy_virtual_disk(
[ 1367.024527] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 1367.024527] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] session._wait_for_task(vmdk_copy_task)
[ 1367.024527] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 1367.024527] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] return self.wait_for_task(task_ref)
[ 1367.024527] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 1367.024527] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] return evt.wait()
[ 1367.024527] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait
[ 1367.024527] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] result = hub.switch()
[ 1367.024527] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch
[ 1367.024527] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] return self.greenlet.switch()
[ 1367.024527] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 1367.024527] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] self.f(*self.args, **self.kw)
[ 1367.024527] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 1367.024961] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] raise exceptions.translate_fault(task_info.error)
[ 1367.024961] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 1367.024961] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] Faults: ['InvalidArgument']
[ 1367.024961] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545]
[ 1367.024961] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] During handling of the above exception, another exception occurred:
[ 1367.024961] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545]
[ 1367.024961] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] Traceback (most recent call last):
[ 1367.024961] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] File "/opt/stack/nova/nova/compute/manager.py", line 2430, in _do_build_and_run_instance
[ 1367.024961] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] self._build_and_run_instance(context, instance, image,
[ 1367.024961] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] File "/opt/stack/nova/nova/compute/manager.py", line 2722, in _build_and_run_instance
[ 1367.024961] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] raise exception.RescheduledException(
[ 1367.024961] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] nova.exception.RescheduledException: Build of instance acc11633-a489-4d8f-ad76-f17049a91545 was re-scheduled: A specified parameter was not correct: fileType
[ 1367.024961] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] Faults: ['InvalidArgument']
[ 1367.024961] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545]
[ 1367.025381] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] During handling of the above exception, another exception occurred:
[ 1367.025381] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545]
[ 1367.025381] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] Traceback (most recent call last):
[ 1367.025381] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1367.025381] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] ret = obj(*args, **kwargs)
[ 1367.025381] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response
[ 1367.025381] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] exception_handler_v20(status_code, error_body)
[ 1367.025381] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20
[ 1367.025381] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] raise client_exc(message=error_message,
[ 1367.025381] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}}
[ 1367.025381] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] Neutron server returns request_ids: ['req-6c0be6ad-e9e9-42dd-97b0-ee600dde1103']
[ 1367.025381] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545]
[ 1367.025381] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] During handling of the above exception, another exception occurred:
[ 1367.025723] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545]
[ 1367.025723] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] Traceback (most recent call last):
[ 1367.025723] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] File "/opt/stack/nova/nova/compute/manager.py", line 3019, in _cleanup_allocated_networks
[ 1367.025723] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] self._deallocate_network(context, instance, requested_networks)
[ 1367.025723] env[68906]: ERROR nova.compute.manager
[instance: acc11633-a489-4d8f-ad76-f17049a91545] File "/opt/stack/nova/nova/compute/manager.py", line 2265, in _deallocate_network [ 1367.025723] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] self.network_api.deallocate_for_instance( [ 1367.025723] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 1367.025723] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] data = neutron.list_ports(**search_opts) [ 1367.025723] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1367.025723] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] ret = obj(*args, **kwargs) [ 1367.025723] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1367.025723] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] return self.list('ports', self.ports_path, retrieve_all, [ 1367.025723] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1367.026114] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] ret = obj(*args, **kwargs) [ 1367.026114] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 372, in list [ 1367.026114] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] for r in self._pagination(collection, path, **params): [ 1367.026114] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1367.026114] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] res = self.get(path, params=params) [ 1367.026114] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1367.026114] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] ret = obj(*args, **kwargs) [ 1367.026114] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 356, in get [ 1367.026114] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] return self.retry_request("GET", action, body=body, [ 1367.026114] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1367.026114] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] ret = obj(*args, **kwargs) [ 1367.026114] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1367.026114] env[68906]: ERROR nova.compute.manager [instance: 
acc11633-a489-4d8f-ad76-f17049a91545] return self.do_request(method, action, body=body, [ 1367.026443] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1367.026443] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] ret = obj(*args, **kwargs) [ 1367.026443] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1367.026443] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] self._handle_fault_response(status_code, replybody, resp) [ 1367.026443] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] File "/opt/stack/nova/nova/network/neutron.py", line 204, in wrapper [ 1367.026443] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] raise exception.Unauthorized() [ 1367.026443] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] nova.exception.Unauthorized: Not authorized. [ 1367.026443] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] [ 1367.076535] env[68906]: INFO nova.scheduler.client.report [None req-86e3143e-5477-482e-8022-d48225e10873 tempest-ListImageFiltersTestJSON-730025420 tempest-ListImageFiltersTestJSON-730025420-project-member] Deleted allocations for instance acc11633-a489-4d8f-ad76-f17049a91545 [ 1367.095882] env[68906]: DEBUG oslo_concurrency.lockutils [None req-86e3143e-5477-482e-8022-d48225e10873 tempest-ListImageFiltersTestJSON-730025420 tempest-ListImageFiltersTestJSON-730025420-project-member] Lock "acc11633-a489-4d8f-ad76-f17049a91545" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 624.744s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1367.097014] env[68906]: DEBUG oslo_concurrency.lockutils [None req-43520988-2e5c-4ea6-8d31-9e7f370b0b01 tempest-ListImageFiltersTestJSON-730025420 tempest-ListImageFiltersTestJSON-730025420-project-member] Lock "acc11633-a489-4d8f-ad76-f17049a91545" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 428.315s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1367.097250] env[68906]: DEBUG oslo_concurrency.lockutils [None req-43520988-2e5c-4ea6-8d31-9e7f370b0b01 tempest-ListImageFiltersTestJSON-730025420 tempest-ListImageFiltersTestJSON-730025420-project-member] Acquiring lock "acc11633-a489-4d8f-ad76-f17049a91545-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1367.097453] env[68906]: DEBUG oslo_concurrency.lockutils [None req-43520988-2e5c-4ea6-8d31-9e7f370b0b01 tempest-ListImageFiltersTestJSON-730025420 tempest-ListImageFiltersTestJSON-730025420-project-member] Lock "acc11633-a489-4d8f-ad76-f17049a91545-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1367.097628] env[68906]: DEBUG oslo_concurrency.lockutils
[None req-43520988-2e5c-4ea6-8d31-9e7f370b0b01 tempest-ListImageFiltersTestJSON-730025420 tempest-ListImageFiltersTestJSON-730025420-project-member] Lock "acc11633-a489-4d8f-ad76-f17049a91545-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1367.099592] env[68906]: INFO nova.compute.manager [None req-43520988-2e5c-4ea6-8d31-9e7f370b0b01 tempest-ListImageFiltersTestJSON-730025420 tempest-ListImageFiltersTestJSON-730025420-project-member] [instance: acc11633-a489-4d8f-ad76-f17049a91545] Terminating instance [ 1367.101274] env[68906]: DEBUG nova.compute.manager [None req-43520988-2e5c-4ea6-8d31-9e7f370b0b01 tempest-ListImageFiltersTestJSON-730025420 tempest-ListImageFiltersTestJSON-730025420-project-member] [instance: acc11633-a489-4d8f-ad76-f17049a91545] Start destroying the instance on the hypervisor. {{(pid=68906) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1367.101467] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-43520988-2e5c-4ea6-8d31-9e7f370b0b01 tempest-ListImageFiltersTestJSON-730025420 tempest-ListImageFiltersTestJSON-730025420-project-member] [instance: acc11633-a489-4d8f-ad76-f17049a91545] Destroying instance {{(pid=68906) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1367.101919] env[68906]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-d5c5b0cf-8b7d-414e-8f22-587f7e3aca33 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1367.107115] env[68906]: DEBUG nova.compute.manager [None req-405761bb-a2e6-43b7-ab88-ccfe847f6af7 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] [instance: 89171680-c76d-4826-9236-379542661ffb] Starting instance... {{(pid=68906) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1367.113712] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-65668709-c285-4d88-bd44-232ac4b3e5a2 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1367.142238] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1367.143107] env[68906]: WARNING nova.virt.vmwareapi.vmops [None req-43520988-2e5c-4ea6-8d31-9e7f370b0b01 tempest-ListImageFiltersTestJSON-730025420 tempest-ListImageFiltersTestJSON-730025420-project-member] [instance: acc11633-a489-4d8f-ad76-f17049a91545] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance acc11633-a489-4d8f-ad76-f17049a91545 could not be found.
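Editorial note on the fault traced above: the 'A specified parameter was not correct: fileType' / InvalidArgument fault is returned by vCenter while nova populates its image cache, i.e. _cache_sparse_image driving VirtualDiskManager.CopyVirtualDisk_Task via vm_util.copy_virtual_disk, after which oslo.vmware's _poll_task translates the failed task state into the VimFaultException seen in the traceback. A minimal sketch of that call pattern through the public oslo.vmware API follows; the endpoint, credentials and datastore paths are illustrative placeholders, not values from this log.

    # Sketch only: the CopyVirtualDisk_Task / wait_for_task pattern from the
    # traceback above. Endpoint, credentials and datastore paths are
    # hypothetical placeholders.
    from oslo_vmware import api as vmware_api
    from oslo_vmware import exceptions as vexc

    session = vmware_api.VMwareAPISession(
        'vcenter.example.org', 'user', 'secret',      # placeholder vCenter
        api_retry_count=10, task_poll_interval=0.5)

    vdm = session.vim.service_content.virtualDiskManager
    try:
        task = session.invoke_api(
            session.vim, 'CopyVirtualDisk_Task', vdm,
            sourceName='[datastore1] cache/tmp-sparse.vmdk',  # placeholder
            destName='[datastore1] cache/root.vmdk')          # placeholder
        # wait_for_task() polls task.info and raises
        # exceptions.translate_fault(task_info.error) on an error state,
        # which is the _poll_task code path shown in the traceback.
        session.wait_for_task(task)
    except vexc.VimFaultException as e:
        print(e.msg, e.fault_list)  # e.g. "...fileType" ['InvalidArgument']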
[ 1367.143247] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-43520988-2e5c-4ea6-8d31-9e7f370b0b01 tempest-ListImageFiltersTestJSON-730025420 tempest-ListImageFiltersTestJSON-730025420-project-member] [instance: acc11633-a489-4d8f-ad76-f17049a91545] Instance destroyed {{(pid=68906) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1367.143486] env[68906]: INFO nova.compute.manager [None req-43520988-2e5c-4ea6-8d31-9e7f370b0b01 tempest-ListImageFiltersTestJSON-730025420 tempest-ListImageFiltersTestJSON-730025420-project-member] [instance: acc11633-a489-4d8f-ad76-f17049a91545] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1367.143743] env[68906]: DEBUG oslo.service.loopingcall [None req-43520988-2e5c-4ea6-8d31-9e7f370b0b01 tempest-ListImageFiltersTestJSON-730025420 tempest-ListImageFiltersTestJSON-730025420-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=68906) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1367.144568] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1367.144762] env[68906]: DEBUG nova.compute.manager [-] [instance: acc11633-a489-4d8f-ad76-f17049a91545] Deallocating network for instance {{(pid=68906) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1367.144863] env[68906]: DEBUG nova.network.neutron [-] [instance: acc11633-a489-4d8f-ad76-f17049a91545] deallocate_for_instance() {{(pid=68906) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1367.160644] env[68906]: DEBUG oslo_concurrency.lockutils [None req-405761bb-a2e6-43b7-ab88-ccfe847f6af7 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1367.160912] env[68906]: DEBUG oslo_concurrency.lockutils [None req-405761bb-a2e6-43b7-ab88-ccfe847f6af7 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1367.162401] env[68906]: INFO nova.compute.claims [None req-405761bb-a2e6-43b7-ab88-ccfe847f6af7 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] [instance: 89171680-c76d-4826-9236-379542661ffb] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1367.237251] env[68906]: DEBUG neutronclient.v2_0.client [-] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=68906) _handle_fault_response /opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py:262}} [ 1367.237913] env[68906]: ERROR nova.network.neutron [-] Neutron client was not able to generate a valid admin token, please verify Neutron admin credential located in nova.conf: neutronclient.common.exceptions.Unauthorized: 
401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1367.237913] env[68906]: ERROR oslo.service.loopingcall [-] Dynamic interval looping call 'oslo_service.loopingcall.RetryDecorator.__call__.<locals>._func' failed: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. [ 1367.237913] env[68906]: ERROR oslo.service.loopingcall Traceback (most recent call last): [ 1367.237913] env[68906]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1367.237913] env[68906]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1367.237913] env[68906]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1367.237913] env[68906]: ERROR oslo.service.loopingcall exception_handler_v20(status_code, error_body) [ 1367.237913] env[68906]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1367.237913] env[68906]: ERROR oslo.service.loopingcall raise client_exc(message=error_message, [ 1367.237913] env[68906]: ERROR oslo.service.loopingcall neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1367.237913] env[68906]: ERROR oslo.service.loopingcall Neutron server returns request_ids: ['req-d620983a-3d70-4c41-bb08-fc9818e7ea4a'] [ 1367.237913] env[68906]: ERROR oslo.service.loopingcall [ 1367.237913] env[68906]: ERROR oslo.service.loopingcall During handling of the above exception, another exception occurred: [ 1367.237913] env[68906]: ERROR oslo.service.loopingcall [ 1367.238569] env[68906]: ERROR oslo.service.loopingcall Traceback (most recent call last): [ 1367.238569] env[68906]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 150, in _run_loop [ 1367.238569] env[68906]: ERROR oslo.service.loopingcall result = func(*self.args, **self.kw) [ 1367.238569] env[68906]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 407, in _func [ 1367.238569] env[68906]: ERROR oslo.service.loopingcall result = f(*args, **kwargs) [ 1367.238569] env[68906]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/compute/manager.py", line 3045, in _deallocate_network_with_retries [ 1367.238569] env[68906]: ERROR oslo.service.loopingcall self._deallocate_network( [ 1367.238569] env[68906]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/compute/manager.py", line 2265, in _deallocate_network [ 1367.238569] env[68906]: ERROR oslo.service.loopingcall self.network_api.deallocate_for_instance( [ 1367.238569] env[68906]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 1367.238569] env[68906]: ERROR oslo.service.loopingcall data = neutron.list_ports(**search_opts) [ 1367.238569] env[68906]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1367.238569] env[68906]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1367.238569] env[68906]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 
815, in list_ports [ 1367.238569] env[68906]: ERROR oslo.service.loopingcall return self.list('ports', self.ports_path, retrieve_all, [ 1367.238569] env[68906]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1367.238569] env[68906]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1367.238569] env[68906]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 372, in list [ 1367.239089] env[68906]: ERROR oslo.service.loopingcall for r in self._pagination(collection, path, **params): [ 1367.239089] env[68906]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1367.239089] env[68906]: ERROR oslo.service.loopingcall res = self.get(path, params=params) [ 1367.239089] env[68906]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1367.239089] env[68906]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1367.239089] env[68906]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 356, in get [ 1367.239089] env[68906]: ERROR oslo.service.loopingcall return self.retry_request("GET", action, body=body, [ 1367.239089] env[68906]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1367.239089] env[68906]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1367.239089] env[68906]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1367.239089] env[68906]: ERROR oslo.service.loopingcall return self.do_request(method, action, body=body, [ 1367.239089] env[68906]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1367.239089] env[68906]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1367.239089] env[68906]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1367.239089] env[68906]: ERROR oslo.service.loopingcall self._handle_fault_response(status_code, replybody, resp) [ 1367.239089] env[68906]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 212, in wrapper [ 1367.239089] env[68906]: ERROR oslo.service.loopingcall raise exception.NeutronAdminCredentialConfigurationInvalid() [ 1367.239610] env[68906]: ERROR oslo.service.loopingcall nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. [ 1367.239610] env[68906]: ERROR oslo.service.loopingcall [ 1367.239610] env[68906]: ERROR nova.compute.manager [None req-43520988-2e5c-4ea6-8d31-9e7f370b0b01 tempest-ListImageFiltersTestJSON-730025420 tempest-ListImageFiltersTestJSON-730025420-project-member] [instance: acc11633-a489-4d8f-ad76-f17049a91545] Failed to deallocate network for instance. Error: Networking client is experiencing an unauthorized exception.: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. 
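The 401 loop above ends in NeutronAdminCredentialConfigurationInvalid: the token nova uses for its admin-context Neutron calls is rejected, and the log itself points at the [neutron] credential block in nova.conf (auth_url, username, password, project_name, user_domain_name, project_domain_name). One way to verify those values independently of nova is to build the equivalent keystoneauth session by hand; every value below is illustrative, not taken from this deployment.

    # Sketch: check [neutron]-style service credentials with keystoneauth1.
    # All endpoint and account values are hypothetical placeholders.
    from keystoneauth1 import session as ks_session
    from keystoneauth1.identity import v3

    auth = v3.Password(
        auth_url='http://controller/identity/v3',   # [neutron] auth_url
        username='nova',                            # [neutron] username
        password='secret',                          # [neutron] password
        project_name='service',                     # [neutron] project_name
        user_domain_name='Default',
        project_domain_name='Default')
    sess = ks_session.Session(auth=auth)
    # A bad credential raises keystoneauth1.exceptions.http.Unauthorized,
    # the same 401 the neutronclient traceback above reports.
    print(sess.get_token())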
[ 1367.268837] env[68906]: ERROR nova.compute.manager [None req-43520988-2e5c-4ea6-8d31-9e7f370b0b01 tempest-ListImageFiltersTestJSON-730025420 tempest-ListImageFiltersTestJSON-730025420-project-member] [instance: acc11633-a489-4d8f-ad76-f17049a91545] Setting instance vm_state to ERROR: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. [ 1367.268837] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] Traceback (most recent call last): [ 1367.268837] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1367.268837] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] ret = obj(*args, **kwargs) [ 1367.268837] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1367.268837] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] exception_handler_v20(status_code, error_body) [ 1367.268837] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1367.268837] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] raise client_exc(message=error_message, [ 1367.268837] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1367.268837] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] Neutron server returns request_ids: ['req-d620983a-3d70-4c41-bb08-fc9818e7ea4a'] [ 1367.268837] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] [ 1367.269131] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] During handling of the above exception, another exception occurred: [ 1367.269131] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] [ 1367.269131] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] Traceback (most recent call last): [ 1367.269131] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] File "/opt/stack/nova/nova/compute/manager.py", line 3315, in do_terminate_instance [ 1367.269131] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] self._delete_instance(context, instance, bdms) [ 1367.269131] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] File "/opt/stack/nova/nova/compute/manager.py", line 3250, in _delete_instance [ 1367.269131] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] self._shutdown_instance(context, instance, bdms) [ 1367.269131] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] File "/opt/stack/nova/nova/compute/manager.py", line 3144, in _shutdown_instance [ 1367.269131] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] self._try_deallocate_network(context, 
instance, requested_networks) [ 1367.269131] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] File "/opt/stack/nova/nova/compute/manager.py", line 3058, in _try_deallocate_network [ 1367.269131] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] with excutils.save_and_reraise_exception(): [ 1367.269131] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1367.269131] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] self.force_reraise() [ 1367.269445] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1367.269445] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] raise self.value [ 1367.269445] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] File "/opt/stack/nova/nova/compute/manager.py", line 3056, in _try_deallocate_network [ 1367.269445] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] _deallocate_network_with_retries() [ 1367.269445] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 436, in func [ 1367.269445] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] return evt.wait() [ 1367.269445] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1367.269445] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] result = hub.switch() [ 1367.269445] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1367.269445] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] return self.greenlet.switch() [ 1367.269445] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 150, in _run_loop [ 1367.269445] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] result = func(*self.args, **self.kw) [ 1367.269774] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 407, in _func [ 1367.269774] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] result = f(*args, **kwargs) [ 1367.269774] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] File "/opt/stack/nova/nova/compute/manager.py", line 3045, in _deallocate_network_with_retries [ 1367.269774] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] self._deallocate_network( [ 1367.269774] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] File "/opt/stack/nova/nova/compute/manager.py", line 2265, in _deallocate_network [ 
1367.269774] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] self.network_api.deallocate_for_instance( [ 1367.269774] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 1367.269774] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] data = neutron.list_ports(**search_opts) [ 1367.269774] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1367.269774] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] ret = obj(*args, **kwargs) [ 1367.269774] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1367.269774] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] return self.list('ports', self.ports_path, retrieve_all, [ 1367.269774] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1367.270122] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] ret = obj(*args, **kwargs) [ 1367.270122] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 372, in list [ 1367.270122] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] for r in self._pagination(collection, path, **params): [ 1367.270122] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1367.270122] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] res = self.get(path, params=params) [ 1367.270122] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1367.270122] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] ret = obj(*args, **kwargs) [ 1367.270122] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 356, in get [ 1367.270122] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] return self.retry_request("GET", action, body=body, [ 1367.270122] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1367.270122] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] ret = obj(*args, **kwargs) [ 1367.270122] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1367.270122] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] return self.do_request(method, action, body=body, [ 1367.270449] env[68906]: ERROR 
nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1367.270449] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] ret = obj(*args, **kwargs) [ 1367.270449] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1367.270449] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] self._handle_fault_response(status_code, replybody, resp) [ 1367.270449] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] File "/opt/stack/nova/nova/network/neutron.py", line 212, in wrapper [ 1367.270449] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] raise exception.NeutronAdminCredentialConfigurationInvalid() [ 1367.270449] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. [ 1367.270449] env[68906]: ERROR nova.compute.manager [instance: acc11633-a489-4d8f-ad76-f17049a91545] [ 1367.297652] env[68906]: DEBUG oslo_concurrency.lockutils [None req-43520988-2e5c-4ea6-8d31-9e7f370b0b01 tempest-ListImageFiltersTestJSON-730025420 tempest-ListImageFiltersTestJSON-730025420-project-member] Lock "acc11633-a489-4d8f-ad76-f17049a91545" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.201s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1367.300406] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Lock "acc11633-a489-4d8f-ad76-f17049a91545" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 156.348s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1367.300611] env[68906]: INFO nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: acc11633-a489-4d8f-ad76-f17049a91545] During sync_power_state the instance has a pending task (deleting). Skip. [ 1367.300979] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Lock "acc11633-a489-4d8f-ad76-f17049a91545" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.001s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1367.353351] env[68906]: INFO nova.compute.manager [None req-43520988-2e5c-4ea6-8d31-9e7f370b0b01 tempest-ListImageFiltersTestJSON-730025420 tempest-ListImageFiltersTestJSON-730025420-project-member] [instance: acc11633-a489-4d8f-ad76-f17049a91545] Successfully reverted task state from None on failure for instance. [ 1367.356755] env[68906]: ERROR oslo_messaging.rpc.server [None req-43520988-2e5c-4ea6-8d31-9e7f370b0b01 tempest-ListImageFiltersTestJSON-730025420 tempest-ListImageFiltersTestJSON-730025420-project-member] Exception during message handling: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. 
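For context on the 'Dynamic interval looping call ... RetryDecorator.__call__.<locals>._func failed' record above: nova wraps _deallocate_network_with_retries in oslo.service's RetryDecorator, which re-runs the function with an increasing sleep, but only for the exception types it is configured with; the credential error here is evidently not in that set, since the call fails on its first attempt and the failure propagates into terminate_instance. A self-contained sketch of the decorator's behaviour, with retry counts and exception types chosen purely for illustration:

    # Sketch of the oslo.service retry pattern seen in the log; the wrapped
    # function and exception types are illustrative stand-ins.
    from oslo_service import loopingcall

    attempts = []

    @loopingcall.RetryDecorator(max_retry_count=3, inc_sleep_time=1,
                                max_sleep_time=5, exceptions=(IOError,))
    def deallocate():
        attempts.append(1)
        if len(attempts) < 3:
            raise IOError('transient')  # listed in `exceptions`: retried
        raise ValueError('fatal')       # not listed: propagates immediately

    try:
        deallocate()
    except ValueError:
        print('gave up after', len(attempts), 'attempts')  # 3 attempts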
[ 1367.356755] env[68906]: ERROR oslo_messaging.rpc.server Traceback (most recent call last): [ 1367.356755] env[68906]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1367.356755] env[68906]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1367.356755] env[68906]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1367.356755] env[68906]: ERROR oslo_messaging.rpc.server exception_handler_v20(status_code, error_body) [ 1367.356755] env[68906]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1367.356755] env[68906]: ERROR oslo_messaging.rpc.server raise client_exc(message=error_message, [ 1367.356755] env[68906]: ERROR oslo_messaging.rpc.server neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1367.356755] env[68906]: ERROR oslo_messaging.rpc.server Neutron server returns request_ids: ['req-d620983a-3d70-4c41-bb08-fc9818e7ea4a'] [ 1367.356755] env[68906]: ERROR oslo_messaging.rpc.server [ 1367.356755] env[68906]: ERROR oslo_messaging.rpc.server During handling of the above exception, another exception occurred: [ 1367.356755] env[68906]: ERROR oslo_messaging.rpc.server [ 1367.356755] env[68906]: ERROR oslo_messaging.rpc.server Traceback (most recent call last): [ 1367.356755] env[68906]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_messaging/rpc/server.py", line 165, in _process_incoming [ 1367.357168] env[68906]: ERROR oslo_messaging.rpc.server res = self.dispatcher.dispatch(message) [ 1367.357168] env[68906]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_messaging/rpc/dispatcher.py", line 309, in dispatch [ 1367.357168] env[68906]: ERROR oslo_messaging.rpc.server return self._do_dispatch(endpoint, method, ctxt, args) [ 1367.357168] env[68906]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_messaging/rpc/dispatcher.py", line 229, in _do_dispatch [ 1367.357168] env[68906]: ERROR oslo_messaging.rpc.server result = func(ctxt, **new_args) [ 1367.357168] env[68906]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/exception_wrapper.py", line 65, in wrapped [ 1367.357168] env[68906]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception(): [ 1367.357168] env[68906]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1367.357168] env[68906]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 1367.357168] env[68906]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1367.357168] env[68906]: ERROR oslo_messaging.rpc.server raise self.value [ 1367.357168] env[68906]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/exception_wrapper.py", line 63, in wrapped [ 1367.357168] env[68906]: ERROR oslo_messaging.rpc.server return f(self, context, *args, **kw) [ 1367.357168] env[68906]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 166, in decorated_function [ 1367.357168] env[68906]: ERROR oslo_messaging.rpc.server with 
excutils.save_and_reraise_exception(): [ 1367.357168] env[68906]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1367.357168] env[68906]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 1367.357168] env[68906]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1367.357679] env[68906]: ERROR oslo_messaging.rpc.server raise self.value [ 1367.357679] env[68906]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 157, in decorated_function [ 1367.357679] env[68906]: ERROR oslo_messaging.rpc.server return function(self, context, *args, **kwargs) [ 1367.357679] env[68906]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/utils.py", line 1439, in decorated_function [ 1367.357679] env[68906]: ERROR oslo_messaging.rpc.server return function(self, context, *args, **kwargs) [ 1367.357679] env[68906]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 213, in decorated_function [ 1367.357679] env[68906]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception(): [ 1367.357679] env[68906]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1367.357679] env[68906]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 1367.357679] env[68906]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1367.357679] env[68906]: ERROR oslo_messaging.rpc.server raise self.value [ 1367.357679] env[68906]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 203, in decorated_function [ 1367.357679] env[68906]: ERROR oslo_messaging.rpc.server return function(self, context, *args, **kwargs) [ 1367.357679] env[68906]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3327, in terminate_instance [ 1367.357679] env[68906]: ERROR oslo_messaging.rpc.server do_terminate_instance(instance, bdms) [ 1367.357679] env[68906]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py", line 414, in inner [ 1367.357679] env[68906]: ERROR oslo_messaging.rpc.server return f(*args, **kwargs) [ 1367.357679] env[68906]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3322, in do_terminate_instance [ 1367.358172] env[68906]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception(): [ 1367.358172] env[68906]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1367.358172] env[68906]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 1367.358172] env[68906]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1367.358172] env[68906]: ERROR oslo_messaging.rpc.server raise self.value [ 1367.358172] env[68906]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3315, in do_terminate_instance [ 1367.358172] env[68906]: ERROR oslo_messaging.rpc.server self._delete_instance(context, instance, bdms) [ 1367.358172] env[68906]: ERROR oslo_messaging.rpc.server File 
"/opt/stack/nova/nova/compute/manager.py", line 3250, in _delete_instance [ 1367.358172] env[68906]: ERROR oslo_messaging.rpc.server self._shutdown_instance(context, instance, bdms) [ 1367.358172] env[68906]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3144, in _shutdown_instance [ 1367.358172] env[68906]: ERROR oslo_messaging.rpc.server self._try_deallocate_network(context, instance, requested_networks) [ 1367.358172] env[68906]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3058, in _try_deallocate_network [ 1367.358172] env[68906]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception(): [ 1367.358172] env[68906]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1367.358172] env[68906]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 1367.358172] env[68906]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1367.358172] env[68906]: ERROR oslo_messaging.rpc.server raise self.value [ 1367.358172] env[68906]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3056, in _try_deallocate_network [ 1367.358623] env[68906]: ERROR oslo_messaging.rpc.server _deallocate_network_with_retries() [ 1367.358623] env[68906]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 436, in func [ 1367.358623] env[68906]: ERROR oslo_messaging.rpc.server return evt.wait() [ 1367.358623] env[68906]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1367.358623] env[68906]: ERROR oslo_messaging.rpc.server result = hub.switch() [ 1367.358623] env[68906]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1367.358623] env[68906]: ERROR oslo_messaging.rpc.server return self.greenlet.switch() [ 1367.358623] env[68906]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 150, in _run_loop [ 1367.358623] env[68906]: ERROR oslo_messaging.rpc.server result = func(*self.args, **self.kw) [ 1367.358623] env[68906]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 407, in _func [ 1367.358623] env[68906]: ERROR oslo_messaging.rpc.server result = f(*args, **kwargs) [ 1367.358623] env[68906]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3045, in _deallocate_network_with_retries [ 1367.358623] env[68906]: ERROR oslo_messaging.rpc.server self._deallocate_network( [ 1367.358623] env[68906]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 2265, in _deallocate_network [ 1367.358623] env[68906]: ERROR oslo_messaging.rpc.server self.network_api.deallocate_for_instance( [ 1367.358623] env[68906]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 1367.358623] env[68906]: ERROR oslo_messaging.rpc.server data = neutron.list_ports(**search_opts) [ 1367.358623] env[68906]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1367.359098] env[68906]: ERROR 
oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1367.359098] env[68906]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1367.359098] env[68906]: ERROR oslo_messaging.rpc.server return self.list('ports', self.ports_path, retrieve_all, [ 1367.359098] env[68906]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1367.359098] env[68906]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1367.359098] env[68906]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 372, in list [ 1367.359098] env[68906]: ERROR oslo_messaging.rpc.server for r in self._pagination(collection, path, **params): [ 1367.359098] env[68906]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1367.359098] env[68906]: ERROR oslo_messaging.rpc.server res = self.get(path, params=params) [ 1367.359098] env[68906]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1367.359098] env[68906]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1367.359098] env[68906]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 356, in get [ 1367.359098] env[68906]: ERROR oslo_messaging.rpc.server return self.retry_request("GET", action, body=body, [ 1367.359098] env[68906]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1367.359098] env[68906]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1367.359098] env[68906]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1367.359098] env[68906]: ERROR oslo_messaging.rpc.server return self.do_request(method, action, body=body, [ 1367.359098] env[68906]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1367.359557] env[68906]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1367.359557] env[68906]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1367.359557] env[68906]: ERROR oslo_messaging.rpc.server self._handle_fault_response(status_code, replybody, resp) [ 1367.359557] env[68906]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 212, in wrapper [ 1367.359557] env[68906]: ERROR oslo_messaging.rpc.server raise exception.NeutronAdminCredentialConfigurationInvalid() [ 1367.359557] env[68906]: ERROR oslo_messaging.rpc.server nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. 
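A quick sanity check on the inventory report for provider 1119f6db-bfd7-4ef3-bdff-5c6974dc249b a few lines below: placement derives usable capacity per resource class as (total - reserved) * allocation_ratio, so the logged figures give 192 VCPU, 196078 MB of RAM and 400 GB of disk, comfortably enough for the 1 vCPU / 128 MB m1.nano claim being made in parallel with this failure. The arithmetic, using the logged values:

    # Capacity check using the inventory values logged below.
    inventory = {
        'VCPU':      {'total': 48,     'reserved': 0,   'allocation_ratio': 4.0},
        'MEMORY_MB': {'total': 196590, 'reserved': 512, 'allocation_ratio': 1.0},
        'DISK_GB':   {'total': 400,    'reserved': 0,   'allocation_ratio': 1.0},
    }
    for rc, inv in inventory.items():
        capacity = (inv['total'] - inv['reserved']) * inv['allocation_ratio']
        print(rc, capacity)  # VCPU 192.0, MEMORY_MB 196078.0, DISK_GB 400.0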
[ 1367.359557] env[68906]: ERROR oslo_messaging.rpc.server [ 1367.447636] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-df735ee1-b06f-4b65-9848-3a10e00aa23c {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1367.455449] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-14029ca1-e621-4f81-a576-a69ff78502ba {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1367.484561] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-306f8cff-3ef6-42cc-bc13-bdb79149dd3f {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1367.491724] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-11997459-ceb7-4e4d-8b54-d7ec9c029c67 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1367.504438] env[68906]: DEBUG nova.compute.provider_tree [None req-405761bb-a2e6-43b7-ab88-ccfe847f6af7 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Inventory has not changed in ProviderTree for provider: 1119f6db-bfd7-4ef3-bdff-5c6974dc249b {{(pid=68906) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1367.514121] env[68906]: DEBUG nova.scheduler.client.report [None req-405761bb-a2e6-43b7-ab88-ccfe847f6af7 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Inventory has not changed for provider 1119f6db-bfd7-4ef3-bdff-5c6974dc249b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 93, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68906) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1367.527285] env[68906]: DEBUG oslo_concurrency.lockutils [None req-405761bb-a2e6-43b7-ab88-ccfe847f6af7 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.366s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1367.527744] env[68906]: DEBUG nova.compute.manager [None req-405761bb-a2e6-43b7-ab88-ccfe847f6af7 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] [instance: 89171680-c76d-4826-9236-379542661ffb] Start building networks asynchronously for instance. 
{{(pid=68906) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1367.559962] env[68906]: DEBUG nova.compute.utils [None req-405761bb-a2e6-43b7-ab88-ccfe847f6af7 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Using /dev/sd instead of None {{(pid=68906) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1367.561466] env[68906]: DEBUG nova.compute.manager [None req-405761bb-a2e6-43b7-ab88-ccfe847f6af7 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] [instance: 89171680-c76d-4826-9236-379542661ffb] Allocating IP information in the background. {{(pid=68906) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1367.561648] env[68906]: DEBUG nova.network.neutron [None req-405761bb-a2e6-43b7-ab88-ccfe847f6af7 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] [instance: 89171680-c76d-4826-9236-379542661ffb] allocate_for_instance() {{(pid=68906) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1367.570032] env[68906]: DEBUG nova.compute.manager [None req-405761bb-a2e6-43b7-ab88-ccfe847f6af7 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] [instance: 89171680-c76d-4826-9236-379542661ffb] Start building block device mappings for instance. {{(pid=68906) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1367.624046] env[68906]: DEBUG nova.policy [None req-405761bb-a2e6-43b7-ab88-ccfe847f6af7 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'b46c06fcd3404f45abc083563415467b', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'da1df204e7064662bf5c15a1598c0d4e', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68906) authorize /opt/stack/nova/nova/policy.py:203}} [ 1367.641304] env[68906]: DEBUG nova.compute.manager [None req-405761bb-a2e6-43b7-ab88-ccfe847f6af7 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] [instance: 89171680-c76d-4826-9236-379542661ffb] Start spawning the instance on the hypervisor. 
{{(pid=68906) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1367.669866] env[68906]: DEBUG nova.virt.hardware [None req-405761bb-a2e6-43b7-ab88-ccfe847f6af7 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-17T13:00:39Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-17T13:00:23Z,direct_url=,disk_format='vmdk',id=b1400c31-d33b-4e13-944f-4c645e62493e,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='1ae7bf3a375d41c6af5e7536af51ffd1',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-17T13:00:24Z,virtual_size=,visibility=), allow threads: False {{(pid=68906) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1367.670138] env[68906]: DEBUG nova.virt.hardware [None req-405761bb-a2e6-43b7-ab88-ccfe847f6af7 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Flavor limits 0:0:0 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1367.670297] env[68906]: DEBUG nova.virt.hardware [None req-405761bb-a2e6-43b7-ab88-ccfe847f6af7 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Image limits 0:0:0 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1367.670478] env[68906]: DEBUG nova.virt.hardware [None req-405761bb-a2e6-43b7-ab88-ccfe847f6af7 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Flavor pref 0:0:0 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1367.670672] env[68906]: DEBUG nova.virt.hardware [None req-405761bb-a2e6-43b7-ab88-ccfe847f6af7 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Image pref 0:0:0 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1367.670779] env[68906]: DEBUG nova.virt.hardware [None req-405761bb-a2e6-43b7-ab88-ccfe847f6af7 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1367.670984] env[68906]: DEBUG nova.virt.hardware [None req-405761bb-a2e6-43b7-ab88-ccfe847f6af7 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68906) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1367.671157] env[68906]: DEBUG nova.virt.hardware [None req-405761bb-a2e6-43b7-ab88-ccfe847f6af7 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68906) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1367.671325] env[68906]: DEBUG 
nova.virt.hardware [None req-405761bb-a2e6-43b7-ab88-ccfe847f6af7 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Got 1 possible topologies {{(pid=68906) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1367.671487] env[68906]: DEBUG nova.virt.hardware [None req-405761bb-a2e6-43b7-ab88-ccfe847f6af7 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68906) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1367.671659] env[68906]: DEBUG nova.virt.hardware [None req-405761bb-a2e6-43b7-ab88-ccfe847f6af7 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68906) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1367.672601] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-432eea2a-781e-497e-a4ed-215b16932833 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1367.681088] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6fe9fbb2-7868-4143-af13-a50aa5e9d785 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1368.017521] env[68906]: DEBUG nova.network.neutron [None req-405761bb-a2e6-43b7-ab88-ccfe847f6af7 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] [instance: 89171680-c76d-4826-9236-379542661ffb] Successfully created port: 53e6f042-9c5f-4263-881b-8f67109a8f72 {{(pid=68906) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1368.140678] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1368.141060] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager.update_available_resource {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1368.157788] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1368.158053] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1368.158230] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=68906) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1368.158390] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=68906) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1368.159565] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a5022250-d974-47a2-abcc-8450851fe1a9 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1368.169041] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fac91794-3c0e-4c6f-8f46-b4e52ccf8da6 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1368.184491] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f7eff4ed-ec2f-422a-8075-240b54eda85d {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1368.192166] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2be196c7-34ac-4dc7-8ab2-4e79038fb449 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1368.224015] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180976MB free_disk=93GB free_vcpus=48 pci_devices=None {{(pid=68906) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1368.224015] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1368.224152] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1368.301305] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance e7286888-d79d-4632-9c06-69c1ef47fa50 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1368.301653] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 641cca5b-d749-4331-a5e0-8acb6d47cba2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1368.301653] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 9bd17d9a-2f34-4e99-91b1-a7c3fbc1f29a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1368.301819] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 4d36bb91-0cde-44cb-8706-d17740a9cf50 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1368.301819] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance db011373-7285-4882-8bce-d39cfa22fe80 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1368.301938] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 1fdb401a-ac25-4418-803c-fc0b2297f2d4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1368.302090] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 6c28f571-e74a-48f9-9cc7-a9e4ddea8b09 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1368.302223] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance e0e595e3-e47e-4cf1-8977-f004eca942d1 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1368.302341] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 7466df8a-59a9-49b9-bff7-c4efbeae3eee actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1368.302455] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 89171680-c76d-4826-9236-379542661ffb actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1368.315992] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 9b884416-df89-4d8c-b2ab-0667db52a718 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1368.328271] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 917ba3c3-9188-40fa-be6c-cdab27b76970 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1368.339780] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 7803f951-a0c0-4246-b2d9-3eabadfa679d has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1368.351238] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 8a4e18b6-55c0-4397-b570-27db4541e9b3 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1368.362291] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 2b688987-d4cf-4ebb-83c4-d5fa7f5bcbb9 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1368.376736] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 3ce59687-c677-40bd-8af4-c2f4b576e86e has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1368.387859] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 45c0d7ba-6d21-46d1-8bcb-0318bd93f885 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1368.398617] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 75a4f8bc-09aa-4c9b-b705-fb84ddcf60ea has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1368.410890] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance ff99f1e3-9a4a-487e-afcb-6d8439a0491d has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1368.421761] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance aed06616-d008-4695-b66e-9f40acf5ebd3 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1368.433085] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance c3386804-6ed9-46fe-b26d-3b5aae52c84b has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1368.433378] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=68906) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1368.433531] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=68906) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1368.577480] env[68906]: DEBUG nova.compute.manager [req-0d79f92f-c55c-403f-9447-e255c1138833 req-dfae74e5-6cde-437e-9fe1-601f2f5b6323 service nova] [instance: 89171680-c76d-4826-9236-379542661ffb] Received event network-vif-plugged-53e6f042-9c5f-4263-881b-8f67109a8f72 {{(pid=68906) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1368.577741] env[68906]: DEBUG oslo_concurrency.lockutils [req-0d79f92f-c55c-403f-9447-e255c1138833 req-dfae74e5-6cde-437e-9fe1-601f2f5b6323 service nova] Acquiring lock "89171680-c76d-4826-9236-379542661ffb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1368.577967] env[68906]: DEBUG oslo_concurrency.lockutils [req-0d79f92f-c55c-403f-9447-e255c1138833 req-dfae74e5-6cde-437e-9fe1-601f2f5b6323 service nova] Lock "89171680-c76d-4826-9236-379542661ffb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1368.578209] env[68906]: DEBUG oslo_concurrency.lockutils [req-0d79f92f-c55c-403f-9447-e255c1138833 req-dfae74e5-6cde-437e-9fe1-601f2f5b6323 service nova] Lock "89171680-c76d-4826-9236-379542661ffb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1368.578388] env[68906]: DEBUG nova.compute.manager [req-0d79f92f-c55c-403f-9447-e255c1138833 req-dfae74e5-6cde-437e-9fe1-601f2f5b6323 service nova] [instance: 89171680-c76d-4826-9236-379542661ffb] No waiting events found dispatching network-vif-plugged-53e6f042-9c5f-4263-881b-8f67109a8f72 {{(pid=68906) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1368.578554] env[68906]: WARNING nova.compute.manager [req-0d79f92f-c55c-403f-9447-e255c1138833 req-dfae74e5-6cde-437e-9fe1-601f2f5b6323 service nova] [instance: 89171680-c76d-4826-9236-379542661ffb] Received unexpected event network-vif-plugged-53e6f042-9c5f-4263-881b-8f67109a8f72 for instance with vm_state building and task_state spawning. 
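The final resource view above is arithmetically consistent with the allocations the tracker just walked: ten actively managed instances, each holding {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}, plus the 512 MB reserved in the MEMORY_MB inventory. A quick check with the figures taken directly from the log:

# Consistency check of the "Final resource view" record above, assuming each
# of the ten active instances holds the {'DISK_GB': 1, 'MEMORY_MB': 128,
# 'VCPU': 1} allocation shown, and 512 MB is reserved per the MEMORY_MB
# inventory reported elsewhere in this log.
instances = 10
used_ram_mb = instances * 128 + 512   # 1792, matching used_ram=1792MB
used_disk_gb = instances * 1          # 10, matching used_disk=10GB
used_vcpus = instances * 1            # 10, matching used_vcpus=10
assert (used_ram_mb, used_disk_gb, used_vcpus) == (1792, 10, 10)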
[ 1368.661296] env[68906]: DEBUG nova.network.neutron [None req-405761bb-a2e6-43b7-ab88-ccfe847f6af7 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] [instance: 89171680-c76d-4826-9236-379542661ffb] Successfully updated port: 53e6f042-9c5f-4263-881b-8f67109a8f72 {{(pid=68906) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1368.676557] env[68906]: DEBUG oslo_concurrency.lockutils [None req-405761bb-a2e6-43b7-ab88-ccfe847f6af7 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Acquiring lock "refresh_cache-89171680-c76d-4826-9236-379542661ffb" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1368.676557] env[68906]: DEBUG oslo_concurrency.lockutils [None req-405761bb-a2e6-43b7-ab88-ccfe847f6af7 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Acquired lock "refresh_cache-89171680-c76d-4826-9236-379542661ffb" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1368.676557] env[68906]: DEBUG nova.network.neutron [None req-405761bb-a2e6-43b7-ab88-ccfe847f6af7 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] [instance: 89171680-c76d-4826-9236-379542661ffb] Building network info cache for instance {{(pid=68906) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1368.723592] env[68906]: DEBUG nova.network.neutron [None req-405761bb-a2e6-43b7-ab88-ccfe847f6af7 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] [instance: 89171680-c76d-4826-9236-379542661ffb] Instance cache missing network info. 
{{(pid=68906) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1368.751421] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-124f8bfd-7f9c-41ef-83a1-91a27dec3100 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1368.762737] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-98b8f5fa-983c-43af-a69f-9756025a6911 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1368.802959] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-28b6ffba-88b8-4490-ae63-3cf00582b4bf {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1368.811308] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-39584572-c411-4c51-8851-c084dafc54bd {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1368.828352] env[68906]: DEBUG nova.compute.provider_tree [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Inventory has not changed in ProviderTree for provider: 1119f6db-bfd7-4ef3-bdff-5c6974dc249b {{(pid=68906) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1368.837014] env[68906]: DEBUG nova.scheduler.client.report [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Inventory has not changed for provider 1119f6db-bfd7-4ef3-bdff-5c6974dc249b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 93, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68906) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1368.851685] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=68906) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1368.852269] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.628s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1368.937177] env[68906]: DEBUG nova.network.neutron [None req-405761bb-a2e6-43b7-ab88-ccfe847f6af7 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] [instance: 89171680-c76d-4826-9236-379542661ffb] Updating instance_info_cache with network_info: [{"id": "53e6f042-9c5f-4263-881b-8f67109a8f72", "address": "fa:16:3e:c8:a7:1d", "network": {"id": "cbbbf860-58e5-4164-8de8-b1492ffc7605", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-1183077938-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": 
[]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "da1df204e7064662bf5c15a1598c0d4e", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "6eb7e3e9-5cc2-40f1-a6eb-f70f06531667", "external-id": "nsx-vlan-transportzone-938", "segmentation_id": 938, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap53e6f042-9c", "ovs_interfaceid": "53e6f042-9c5f-4263-881b-8f67109a8f72", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68906) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1368.950791] env[68906]: DEBUG oslo_concurrency.lockutils [None req-405761bb-a2e6-43b7-ab88-ccfe847f6af7 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Releasing lock "refresh_cache-89171680-c76d-4826-9236-379542661ffb" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1368.951145] env[68906]: DEBUG nova.compute.manager [None req-405761bb-a2e6-43b7-ab88-ccfe847f6af7 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] [instance: 89171680-c76d-4826-9236-379542661ffb] Instance network_info: |[{"id": "53e6f042-9c5f-4263-881b-8f67109a8f72", "address": "fa:16:3e:c8:a7:1d", "network": {"id": "cbbbf860-58e5-4164-8de8-b1492ffc7605", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-1183077938-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "da1df204e7064662bf5c15a1598c0d4e", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "6eb7e3e9-5cc2-40f1-a6eb-f70f06531667", "external-id": "nsx-vlan-transportzone-938", "segmentation_id": 938, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap53e6f042-9c", "ovs_interfaceid": "53e6f042-9c5f-4263-881b-8f67109a8f72", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=68906) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1368.951553] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-405761bb-a2e6-43b7-ab88-ccfe847f6af7 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] [instance: 89171680-c76d-4826-9236-379542661ffb] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:c8:a7:1d', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '6eb7e3e9-5cc2-40f1-a6eb-f70f06531667', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '53e6f042-9c5f-4263-881b-8f67109a8f72', 'vif_model': 'vmxnet3'}] {{(pid=68906) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1368.959442] env[68906]: DEBUG oslo.service.loopingcall [None req-405761bb-a2e6-43b7-ab88-ccfe847f6af7 
tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=68906) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1368.959985] env[68906]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 89171680-c76d-4826-9236-379542661ffb] Creating VM on the ESX host {{(pid=68906) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1368.960228] env[68906]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-dd965b92-da22-479a-be17-f0089dbd540c {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1368.981679] env[68906]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1368.981679] env[68906]: value = "task-3475389" [ 1368.981679] env[68906]: _type = "Task" [ 1368.981679] env[68906]: } to complete. {{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1368.990073] env[68906]: DEBUG oslo_vmware.api [-] Task: {'id': task-3475389, 'name': CreateVM_Task} progress is 0%. {{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1369.492277] env[68906]: DEBUG oslo_vmware.api [-] Task: {'id': task-3475389, 'name': CreateVM_Task, 'duration_secs': 0.333005} completed successfully. {{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1369.492651] env[68906]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 89171680-c76d-4826-9236-379542661ffb] Created VM on the ESX host {{(pid=68906) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1369.493137] env[68906]: DEBUG oslo_concurrency.lockutils [None req-405761bb-a2e6-43b7-ab88-ccfe847f6af7 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1369.493305] env[68906]: DEBUG oslo_concurrency.lockutils [None req-405761bb-a2e6-43b7-ab88-ccfe847f6af7 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Acquired lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1369.493628] env[68906]: DEBUG oslo_concurrency.lockutils [None req-405761bb-a2e6-43b7-ab88-ccfe847f6af7 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1369.493877] env[68906]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-40124dfa-53fb-4378-8d7d-a21afd9b0626 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1369.498096] env[68906]: DEBUG oslo_vmware.api [None req-405761bb-a2e6-43b7-ab88-ccfe847f6af7 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Waiting for the task: (returnval){ [ 1369.498096] env[68906]: value = 
"session[52a3cb0d-1212-490c-2669-91043b4da4d8]528bbf91-a245-1c12-765a-77c1fe3c7966" [ 1369.498096] env[68906]: _type = "Task" [ 1369.498096] env[68906]: } to complete. {{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1369.505627] env[68906]: DEBUG oslo_vmware.api [None req-405761bb-a2e6-43b7-ab88-ccfe847f6af7 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Task: {'id': session[52a3cb0d-1212-490c-2669-91043b4da4d8]528bbf91-a245-1c12-765a-77c1fe3c7966, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1370.008541] env[68906]: DEBUG oslo_concurrency.lockutils [None req-405761bb-a2e6-43b7-ab88-ccfe847f6af7 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Releasing lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1370.008632] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-405761bb-a2e6-43b7-ab88-ccfe847f6af7 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] [instance: 89171680-c76d-4826-9236-379542661ffb] Processing image b1400c31-d33b-4e13-944f-4c645e62493e {{(pid=68906) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1370.008792] env[68906]: DEBUG oslo_concurrency.lockutils [None req-405761bb-a2e6-43b7-ab88-ccfe847f6af7 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e/b1400c31-d33b-4e13-944f-4c645e62493e.vmdk" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1370.604314] env[68906]: DEBUG nova.compute.manager [req-8adbb52a-aef2-4562-8361-e9b8f5311a7b req-bb1c1ac1-64e0-4a05-90c5-923f3bfd1804 service nova] [instance: 89171680-c76d-4826-9236-379542661ffb] Received event network-changed-53e6f042-9c5f-4263-881b-8f67109a8f72 {{(pid=68906) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1370.604314] env[68906]: DEBUG nova.compute.manager [req-8adbb52a-aef2-4562-8361-e9b8f5311a7b req-bb1c1ac1-64e0-4a05-90c5-923f3bfd1804 service nova] [instance: 89171680-c76d-4826-9236-379542661ffb] Refreshing instance network info cache due to event network-changed-53e6f042-9c5f-4263-881b-8f67109a8f72. 
{{(pid=68906) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1370.604314] env[68906]: DEBUG oslo_concurrency.lockutils [req-8adbb52a-aef2-4562-8361-e9b8f5311a7b req-bb1c1ac1-64e0-4a05-90c5-923f3bfd1804 service nova] Acquiring lock "refresh_cache-89171680-c76d-4826-9236-379542661ffb" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1370.604670] env[68906]: DEBUG oslo_concurrency.lockutils [req-8adbb52a-aef2-4562-8361-e9b8f5311a7b req-bb1c1ac1-64e0-4a05-90c5-923f3bfd1804 service nova] Acquired lock "refresh_cache-89171680-c76d-4826-9236-379542661ffb" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1370.604670] env[68906]: DEBUG nova.network.neutron [req-8adbb52a-aef2-4562-8361-e9b8f5311a7b req-bb1c1ac1-64e0-4a05-90c5-923f3bfd1804 service nova] [instance: 89171680-c76d-4826-9236-379542661ffb] Refreshing network info cache for port 53e6f042-9c5f-4263-881b-8f67109a8f72 {{(pid=68906) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1370.840832] env[68906]: DEBUG nova.network.neutron [req-8adbb52a-aef2-4562-8361-e9b8f5311a7b req-bb1c1ac1-64e0-4a05-90c5-923f3bfd1804 service nova] [instance: 89171680-c76d-4826-9236-379542661ffb] Updated VIF entry in instance network info cache for port 53e6f042-9c5f-4263-881b-8f67109a8f72. {{(pid=68906) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1370.841248] env[68906]: DEBUG nova.network.neutron [req-8adbb52a-aef2-4562-8361-e9b8f5311a7b req-bb1c1ac1-64e0-4a05-90c5-923f3bfd1804 service nova] [instance: 89171680-c76d-4826-9236-379542661ffb] Updating instance_info_cache with network_info: [{"id": "53e6f042-9c5f-4263-881b-8f67109a8f72", "address": "fa:16:3e:c8:a7:1d", "network": {"id": "cbbbf860-58e5-4164-8de8-b1492ffc7605", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-1183077938-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "da1df204e7064662bf5c15a1598c0d4e", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "6eb7e3e9-5cc2-40f1-a6eb-f70f06531667", "external-id": "nsx-vlan-transportzone-938", "segmentation_id": 938, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap53e6f042-9c", "ovs_interfaceid": "53e6f042-9c5f-4263-881b-8f67109a8f72", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68906) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1370.850155] env[68906]: DEBUG oslo_concurrency.lockutils [req-8adbb52a-aef2-4562-8361-e9b8f5311a7b req-bb1c1ac1-64e0-4a05-90c5-923f3bfd1804 service nova] Releasing lock "refresh_cache-89171680-c76d-4826-9236-379542661ffb" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1376.193418] env[68906]: DEBUG oslo_concurrency.lockutils [None req-b9080d5d-4234-4166-bade-e67041a2b08f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] 
Acquiring lock "17327bc3-433e-4006-93c7-e53714ed70c2" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1376.193729] env[68906]: DEBUG oslo_concurrency.lockutils [None req-b9080d5d-4234-4166-bade-e67041a2b08f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Lock "17327bc3-433e-4006-93c7-e53714ed70c2" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1380.564919] env[68906]: DEBUG oslo_concurrency.lockutils [None req-2be6fa1d-6d86-4f82-877d-7c51f7330fb4 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Acquiring lock "89171680-c76d-4826-9236-379542661ffb" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1415.301231] env[68906]: WARNING oslo_vmware.rw_handles [None req-e2cf9422-57fc-4510-acba-da32daf36063 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1415.301231] env[68906]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1415.301231] env[68906]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1415.301231] env[68906]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1415.301231] env[68906]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1415.301231] env[68906]: ERROR oslo_vmware.rw_handles response.begin() [ 1415.301231] env[68906]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1415.301231] env[68906]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1415.301231] env[68906]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1415.301231] env[68906]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1415.301231] env[68906]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1415.301231] env[68906]: ERROR oslo_vmware.rw_handles [ 1415.301928] env[68906]: DEBUG nova.virt.vmwareapi.images [None req-e2cf9422-57fc-4510-acba-da32daf36063 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] [instance: e7286888-d79d-4632-9c06-69c1ef47fa50] Downloaded image file data b1400c31-d33b-4e13-944f-4c645e62493e to vmware_temp/8b3d9d67-80e6-4d1e-9a99-8c2390ef71a5/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk on the data store datastore2 {{(pid=68906) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1415.303900] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-e2cf9422-57fc-4510-acba-da32daf36063 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] [instance: e7286888-d79d-4632-9c06-69c1ef47fa50] Caching image {{(pid=68906) 
_fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1415.304151] env[68906]: DEBUG nova.virt.vmwareapi.vm_util [None req-e2cf9422-57fc-4510-acba-da32daf36063 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Copying Virtual Disk [datastore2] vmware_temp/8b3d9d67-80e6-4d1e-9a99-8c2390ef71a5/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk to [datastore2] vmware_temp/8b3d9d67-80e6-4d1e-9a99-8c2390ef71a5/b1400c31-d33b-4e13-944f-4c645e62493e/b1400c31-d33b-4e13-944f-4c645e62493e.vmdk {{(pid=68906) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1415.304458] env[68906]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-c4f4f550-964d-4d49-b908-e233aa79b422 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1415.313257] env[68906]: DEBUG oslo_vmware.api [None req-e2cf9422-57fc-4510-acba-da32daf36063 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Waiting for the task: (returnval){ [ 1415.313257] env[68906]: value = "task-3475390" [ 1415.313257] env[68906]: _type = "Task" [ 1415.313257] env[68906]: } to complete. {{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1415.321412] env[68906]: DEBUG oslo_vmware.api [None req-e2cf9422-57fc-4510-acba-da32daf36063 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Task: {'id': task-3475390, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1415.824303] env[68906]: DEBUG oslo_vmware.exceptions [None req-e2cf9422-57fc-4510-acba-da32daf36063 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Fault InvalidArgument not matched. 
{{(pid=68906) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1415.824670] env[68906]: DEBUG oslo_concurrency.lockutils [None req-e2cf9422-57fc-4510-acba-da32daf36063 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Releasing lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e/b1400c31-d33b-4e13-944f-4c645e62493e.vmdk" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 1415.825314] env[68906]: ERROR nova.compute.manager [None req-e2cf9422-57fc-4510-acba-da32daf36063 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] [instance: e7286888-d79d-4632-9c06-69c1ef47fa50] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 1415.825314] env[68906]: Faults: ['InvalidArgument']
[ 1415.825314] env[68906]: ERROR nova.compute.manager [instance: e7286888-d79d-4632-9c06-69c1ef47fa50] Traceback (most recent call last):
[ 1415.825314] env[68906]: ERROR nova.compute.manager [instance: e7286888-d79d-4632-9c06-69c1ef47fa50]   File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources
[ 1415.825314] env[68906]: ERROR nova.compute.manager [instance: e7286888-d79d-4632-9c06-69c1ef47fa50]     yield resources
[ 1415.825314] env[68906]: ERROR nova.compute.manager [instance: e7286888-d79d-4632-9c06-69c1ef47fa50]   File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance
[ 1415.825314] env[68906]: ERROR nova.compute.manager [instance: e7286888-d79d-4632-9c06-69c1ef47fa50]     self.driver.spawn(context, instance, image_meta,
[ 1415.825314] env[68906]: ERROR nova.compute.manager [instance: e7286888-d79d-4632-9c06-69c1ef47fa50]   File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn
[ 1415.825314] env[68906]: ERROR nova.compute.manager [instance: e7286888-d79d-4632-9c06-69c1ef47fa50]     self._vmops.spawn(context, instance, image_meta, injected_files,
[ 1415.825314] env[68906]: ERROR nova.compute.manager [instance: e7286888-d79d-4632-9c06-69c1ef47fa50]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 1415.825314] env[68906]: ERROR nova.compute.manager [instance: e7286888-d79d-4632-9c06-69c1ef47fa50]     self._fetch_image_if_missing(context, vi)
[ 1415.825314] env[68906]: ERROR nova.compute.manager [instance: e7286888-d79d-4632-9c06-69c1ef47fa50]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 1415.825893] env[68906]: ERROR nova.compute.manager [instance: e7286888-d79d-4632-9c06-69c1ef47fa50]     image_cache(vi, tmp_image_ds_loc)
[ 1415.825893] env[68906]: ERROR nova.compute.manager [instance: e7286888-d79d-4632-9c06-69c1ef47fa50]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 1415.825893] env[68906]: ERROR nova.compute.manager [instance: e7286888-d79d-4632-9c06-69c1ef47fa50]     vm_util.copy_virtual_disk(
[ 1415.825893] env[68906]: ERROR nova.compute.manager [instance: e7286888-d79d-4632-9c06-69c1ef47fa50]   File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 1415.825893] env[68906]: ERROR nova.compute.manager [instance: e7286888-d79d-4632-9c06-69c1ef47fa50]     session._wait_for_task(vmdk_copy_task)
[ 1415.825893] env[68906]: ERROR nova.compute.manager [instance: e7286888-d79d-4632-9c06-69c1ef47fa50]   File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 1415.825893] env[68906]: ERROR nova.compute.manager [instance: e7286888-d79d-4632-9c06-69c1ef47fa50]     return self.wait_for_task(task_ref)
[ 1415.825893] env[68906]: ERROR nova.compute.manager [instance: e7286888-d79d-4632-9c06-69c1ef47fa50]   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 1415.825893] env[68906]: ERROR nova.compute.manager [instance: e7286888-d79d-4632-9c06-69c1ef47fa50]     return evt.wait()
[ 1415.825893] env[68906]: ERROR nova.compute.manager [instance: e7286888-d79d-4632-9c06-69c1ef47fa50]   File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait
[ 1415.825893] env[68906]: ERROR nova.compute.manager [instance: e7286888-d79d-4632-9c06-69c1ef47fa50]     result = hub.switch()
[ 1415.825893] env[68906]: ERROR nova.compute.manager [instance: e7286888-d79d-4632-9c06-69c1ef47fa50]   File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch
[ 1415.825893] env[68906]: ERROR nova.compute.manager [instance: e7286888-d79d-4632-9c06-69c1ef47fa50]     return self.greenlet.switch()
[ 1415.826877] env[68906]: ERROR nova.compute.manager [instance: e7286888-d79d-4632-9c06-69c1ef47fa50]   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 1415.826877] env[68906]: ERROR nova.compute.manager [instance: e7286888-d79d-4632-9c06-69c1ef47fa50]     self.f(*self.args, **self.kw)
[ 1415.826877] env[68906]: ERROR nova.compute.manager [instance: e7286888-d79d-4632-9c06-69c1ef47fa50]   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 1415.826877] env[68906]: ERROR nova.compute.manager [instance: e7286888-d79d-4632-9c06-69c1ef47fa50]     raise exceptions.translate_fault(task_info.error)
[ 1415.826877] env[68906]: ERROR nova.compute.manager [instance: e7286888-d79d-4632-9c06-69c1ef47fa50] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 1415.826877] env[68906]: ERROR nova.compute.manager [instance: e7286888-d79d-4632-9c06-69c1ef47fa50] Faults: ['InvalidArgument']
[ 1415.826877] env[68906]: ERROR nova.compute.manager [instance: e7286888-d79d-4632-9c06-69c1ef47fa50]
[ 1415.826877] env[68906]: INFO nova.compute.manager [None req-e2cf9422-57fc-4510-acba-da32daf36063 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] [instance: e7286888-d79d-4632-9c06-69c1ef47fa50] Terminating instance
[ 1415.827584] env[68906]: DEBUG oslo_concurrency.lockutils [None req-8e5cf191-d2ff-420a-92ea-3d8a1505aeef tempest-ServersTestManualDisk-672681136 tempest-ServersTestManualDisk-672681136-project-member] Acquired lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e/b1400c31-d33b-4e13-944f-4c645e62493e.vmdk" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1415.827819] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-8e5cf191-d2ff-420a-92ea-3d8a1505aeef tempest-ServersTestManualDisk-672681136 tempest-ServersTestManualDisk-672681136-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=68906) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1415.828080] env[68906]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-e894c690-37da-4cba-b71d-2bb80e44b00c {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1415.831127] env[68906]: DEBUG nova.compute.manager [None req-e2cf9422-57fc-4510-acba-da32daf36063 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] [instance: e7286888-d79d-4632-9c06-69c1ef47fa50] Start destroying the instance on the hypervisor. {{(pid=68906) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1415.831327] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-e2cf9422-57fc-4510-acba-da32daf36063 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] [instance: e7286888-d79d-4632-9c06-69c1ef47fa50] Destroying instance {{(pid=68906) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1415.832175] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f6dd2117-bbe5-4c74-8f6c-96c47912e89b {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1415.839107] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-e2cf9422-57fc-4510-acba-da32daf36063 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] [instance: e7286888-d79d-4632-9c06-69c1ef47fa50] Unregistering the VM {{(pid=68906) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1415.839342] env[68906]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-ad888093-042b-4b41-941e-c25884007bde {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1415.841687] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-8e5cf191-d2ff-420a-92ea-3d8a1505aeef tempest-ServersTestManualDisk-672681136 tempest-ServersTestManualDisk-672681136-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=68906) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1415.841870] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-8e5cf191-d2ff-420a-92ea-3d8a1505aeef tempest-ServersTestManualDisk-672681136 tempest-ServersTestManualDisk-672681136-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=68906) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1415.842938] env[68906]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-222ec2e6-3962-43ae-8b8c-02bc7c22cbb3 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1415.847844] env[68906]: DEBUG oslo_vmware.api [None req-8e5cf191-d2ff-420a-92ea-3d8a1505aeef tempest-ServersTestManualDisk-672681136 tempest-ServersTestManualDisk-672681136-project-member] Waiting for the task: (returnval){ [ 1415.847844] env[68906]: value = "session[52a3cb0d-1212-490c-2669-91043b4da4d8]529c07ff-7338-061e-6b17-033c322e57e8" [ 1415.847844] env[68906]: _type = "Task" [ 1415.847844] env[68906]: } to complete. 
{{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1415.854875] env[68906]: DEBUG oslo_vmware.api [None req-8e5cf191-d2ff-420a-92ea-3d8a1505aeef tempest-ServersTestManualDisk-672681136 tempest-ServersTestManualDisk-672681136-project-member] Task: {'id': session[52a3cb0d-1212-490c-2669-91043b4da4d8]529c07ff-7338-061e-6b17-033c322e57e8, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1415.907835] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-e2cf9422-57fc-4510-acba-da32daf36063 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] [instance: e7286888-d79d-4632-9c06-69c1ef47fa50] Unregistered the VM {{(pid=68906) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1415.908112] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-e2cf9422-57fc-4510-acba-da32daf36063 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] [instance: e7286888-d79d-4632-9c06-69c1ef47fa50] Deleting contents of the VM from datastore datastore2 {{(pid=68906) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1415.908245] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-e2cf9422-57fc-4510-acba-da32daf36063 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Deleting the datastore file [datastore2] e7286888-d79d-4632-9c06-69c1ef47fa50 {{(pid=68906) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1415.908594] env[68906]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-0791745f-63bd-4ec8-a969-0e1882a67f9a {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1415.915263] env[68906]: DEBUG oslo_vmware.api [None req-e2cf9422-57fc-4510-acba-da32daf36063 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Waiting for the task: (returnval){ [ 1415.915263] env[68906]: value = "task-3475392" [ 1415.915263] env[68906]: _type = "Task" [ 1415.915263] env[68906]: } to complete. {{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1415.923144] env[68906]: DEBUG oslo_vmware.api [None req-e2cf9422-57fc-4510-acba-da32daf36063 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Task: {'id': task-3475392, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1416.358668] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-8e5cf191-d2ff-420a-92ea-3d8a1505aeef tempest-ServersTestManualDisk-672681136 tempest-ServersTestManualDisk-672681136-project-member] [instance: 641cca5b-d749-4331-a5e0-8acb6d47cba2] Preparing fetch location {{(pid=68906) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1416.359014] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-8e5cf191-d2ff-420a-92ea-3d8a1505aeef tempest-ServersTestManualDisk-672681136 tempest-ServersTestManualDisk-672681136-project-member] Creating directory with path [datastore2] vmware_temp/11359a14-7a18-40db-b52c-dc55b45ab866/b1400c31-d33b-4e13-944f-4c645e62493e {{(pid=68906) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1416.359166] env[68906]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-75c9321c-71e4-4e6e-91fc-d3ff62f8e7e4 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1416.369493] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-8e5cf191-d2ff-420a-92ea-3d8a1505aeef tempest-ServersTestManualDisk-672681136 tempest-ServersTestManualDisk-672681136-project-member] Created directory with path [datastore2] vmware_temp/11359a14-7a18-40db-b52c-dc55b45ab866/b1400c31-d33b-4e13-944f-4c645e62493e {{(pid=68906) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1416.369693] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-8e5cf191-d2ff-420a-92ea-3d8a1505aeef tempest-ServersTestManualDisk-672681136 tempest-ServersTestManualDisk-672681136-project-member] [instance: 641cca5b-d749-4331-a5e0-8acb6d47cba2] Fetch image to [datastore2] vmware_temp/11359a14-7a18-40db-b52c-dc55b45ab866/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk {{(pid=68906) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1416.369865] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-8e5cf191-d2ff-420a-92ea-3d8a1505aeef tempest-ServersTestManualDisk-672681136 tempest-ServersTestManualDisk-672681136-project-member] [instance: 641cca5b-d749-4331-a5e0-8acb6d47cba2] Downloading image file data b1400c31-d33b-4e13-944f-4c645e62493e to [datastore2] vmware_temp/11359a14-7a18-40db-b52c-dc55b45ab866/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk on the data store datastore2 {{(pid=68906) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1416.370583] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5c284a77-6385-443f-86de-c4889ed8d8bb {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1416.376737] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b421b948-7b69-4dc3-b1db-237408bb3055 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1416.385200] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1949814d-8fcf-45c1-974a-79c1046f3c70 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1416.418705] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e849cf8e-cf18-46d3-bce7-4663b7bb83f4 {{(pid=68906) 
request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1416.425754] env[68906]: DEBUG oslo_vmware.api [None req-e2cf9422-57fc-4510-acba-da32daf36063 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Task: {'id': task-3475392, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.074502} completed successfully. {{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1416.427206] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-e2cf9422-57fc-4510-acba-da32daf36063 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Deleted the datastore file {{(pid=68906) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1416.427396] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-e2cf9422-57fc-4510-acba-da32daf36063 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] [instance: e7286888-d79d-4632-9c06-69c1ef47fa50] Deleted contents of the VM from datastore datastore2 {{(pid=68906) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1416.427568] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-e2cf9422-57fc-4510-acba-da32daf36063 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] [instance: e7286888-d79d-4632-9c06-69c1ef47fa50] Instance destroyed {{(pid=68906) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1416.427744] env[68906]: INFO nova.compute.manager [None req-e2cf9422-57fc-4510-acba-da32daf36063 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] [instance: e7286888-d79d-4632-9c06-69c1ef47fa50] Took 0.60 seconds to destroy the instance on the hypervisor. 
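The cleanup sequence above follows a fixed pattern: unregister the VM, issue FileManager.DeleteDatastoreFile_Task for the instance directory, then poll the returned task until it completes (here in duration_secs: 0.074502). Below is a minimal sketch of that invoke-and-wait pattern against the oslo.vmware session API; the helper name and the FileNotFoundException handling are illustrative assumptions, not the actual nova.virt.vmwareapi.ds_util code.

from oslo_vmware import exceptions as vexc


def delete_datastore_file(session, ds_path, datacenter_ref):
    # Illustrative sketch (assumed helper, not the real ds_util.file_delete):
    # ask the vSphere FileManager to delete a datastore path and block until
    # the server-side task finishes, as the records above do.
    file_manager = session.vim.service_content.fileManager
    task = session.invoke_api(session.vim, 'DeleteDatastoreFile_Task',
                              file_manager,
                              name=ds_path,  # e.g. "[datastore2] e7286888-d79d-4632-9c06-69c1ef47fa50"
                              datacenter=datacenter_ref)
    try:
        # wait_for_task polls the task object; when task_info.error is set,
        # oslo.vmware raises the translated fault (e.g. VimFaultException).
        session.wait_for_task(task)
    except vexc.FileNotFoundException:
        # Treat an already-missing file as success so cleanup stays idempotent.
        pass

The same invoke-then-wait shape recurs throughout this log wherever a *_Task method (CopyVirtualDisk_Task, CreateVM_Task, SearchDatastore_Task) is invoked.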
[ 1416.429469] env[68906]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-bfb8e731-24be-4167-8608-af04e5bc825e {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1416.431339] env[68906]: DEBUG nova.compute.claims [None req-e2cf9422-57fc-4510-acba-da32daf36063 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] [instance: e7286888-d79d-4632-9c06-69c1ef47fa50] Aborting claim: {{(pid=68906) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1416.431526] env[68906]: DEBUG oslo_concurrency.lockutils [None req-e2cf9422-57fc-4510-acba-da32daf36063 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1416.431745] env[68906]: DEBUG oslo_concurrency.lockutils [None req-e2cf9422-57fc-4510-acba-da32daf36063 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1416.456067] env[68906]: DEBUG nova.virt.vmwareapi.images [None req-8e5cf191-d2ff-420a-92ea-3d8a1505aeef tempest-ServersTestManualDisk-672681136 tempest-ServersTestManualDisk-672681136-project-member] [instance: 641cca5b-d749-4331-a5e0-8acb6d47cba2] Downloading image file data b1400c31-d33b-4e13-944f-4c645e62493e to the data store datastore2 {{(pid=68906) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1416.503789] env[68906]: DEBUG oslo_vmware.rw_handles [None req-8e5cf191-d2ff-420a-92ea-3d8a1505aeef tempest-ServersTestManualDisk-672681136 tempest-ServersTestManualDisk-672681136-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/11359a14-7a18-40db-b52c-dc55b45ab866/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=68906) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1416.563201] env[68906]: DEBUG oslo_vmware.rw_handles [None req-8e5cf191-d2ff-420a-92ea-3d8a1505aeef tempest-ServersTestManualDisk-672681136 tempest-ServersTestManualDisk-672681136-project-member] Completed reading data from the image iterator. {{(pid=68906) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1416.563390] env[68906]: DEBUG oslo_vmware.rw_handles [None req-8e5cf191-d2ff-420a-92ea-3d8a1505aeef tempest-ServersTestManualDisk-672681136 tempest-ServersTestManualDisk-672681136-project-member] Closing write handle for https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/11359a14-7a18-40db-b52c-dc55b45ab866/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=68906) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}}
[ 1416.769063] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d76e5056-357e-4c27-bd64-32bc2b3902e6 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1416.777535] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-73dbda91-64bb-404b-b9af-6a79febc0f8d {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1416.809328] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fa44e56e-0486-4084-adc9-da2963b6dffc {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1416.815622] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4045d0a6-b43c-455d-b90f-c259c8a0efd6 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1417.507527] env[68906]: DEBUG nova.compute.provider_tree [None req-e2cf9422-57fc-4510-acba-da32daf36063 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Inventory has not changed in ProviderTree for provider: 1119f6db-bfd7-4ef3-bdff-5c6974dc249b {{(pid=68906) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 1417.517525] env[68906]: DEBUG nova.scheduler.client.report [None req-e2cf9422-57fc-4510-acba-da32daf36063 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Inventory has not changed for provider 1119f6db-bfd7-4ef3-bdff-5c6974dc249b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 93, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68906) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 1417.530846] env[68906]: DEBUG oslo_concurrency.lockutils [None req-e2cf9422-57fc-4510-acba-da32daf36063 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 1.099s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1417.531602] env[68906]: ERROR nova.compute.manager [None req-e2cf9422-57fc-4510-acba-da32daf36063 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] [instance: e7286888-d79d-4632-9c06-69c1ef47fa50] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 1417.531602] env[68906]: Faults: ['InvalidArgument']
[ 1417.531602] env[68906]: ERROR nova.compute.manager [instance: e7286888-d79d-4632-9c06-69c1ef47fa50] Traceback (most recent call last):
[ 1417.531602] env[68906]: ERROR nova.compute.manager [instance: e7286888-d79d-4632-9c06-69c1ef47fa50] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance
[ 1417.531602] env[68906]: ERROR nova.compute.manager [instance: e7286888-d79d-4632-9c06-69c1ef47fa50] self.driver.spawn(context, instance, image_meta,
[ 1417.531602] env[68906]: ERROR nova.compute.manager [instance: e7286888-d79d-4632-9c06-69c1ef47fa50] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn
[ 1417.531602] env[68906]: ERROR nova.compute.manager [instance: e7286888-d79d-4632-9c06-69c1ef47fa50] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 1417.531602] env[68906]: ERROR nova.compute.manager [instance: e7286888-d79d-4632-9c06-69c1ef47fa50] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 1417.531602] env[68906]: ERROR nova.compute.manager [instance: e7286888-d79d-4632-9c06-69c1ef47fa50] self._fetch_image_if_missing(context, vi)
[ 1417.531602] env[68906]: ERROR nova.compute.manager [instance: e7286888-d79d-4632-9c06-69c1ef47fa50] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 1417.531602] env[68906]: ERROR nova.compute.manager [instance: e7286888-d79d-4632-9c06-69c1ef47fa50] image_cache(vi, tmp_image_ds_loc)
[ 1417.531602] env[68906]: ERROR nova.compute.manager [instance: e7286888-d79d-4632-9c06-69c1ef47fa50] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 1417.531934] env[68906]: ERROR nova.compute.manager [instance: e7286888-d79d-4632-9c06-69c1ef47fa50] vm_util.copy_virtual_disk(
[ 1417.531934] env[68906]: ERROR nova.compute.manager [instance: e7286888-d79d-4632-9c06-69c1ef47fa50] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 1417.531934] env[68906]: ERROR nova.compute.manager [instance: e7286888-d79d-4632-9c06-69c1ef47fa50] session._wait_for_task(vmdk_copy_task)
[ 1417.531934] env[68906]: ERROR nova.compute.manager [instance: e7286888-d79d-4632-9c06-69c1ef47fa50] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 1417.531934] env[68906]: ERROR nova.compute.manager [instance: e7286888-d79d-4632-9c06-69c1ef47fa50] return self.wait_for_task(task_ref)
[ 1417.531934] env[68906]: ERROR nova.compute.manager [instance: e7286888-d79d-4632-9c06-69c1ef47fa50] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 1417.531934] env[68906]: ERROR nova.compute.manager [instance: e7286888-d79d-4632-9c06-69c1ef47fa50] return evt.wait()
[ 1417.531934] env[68906]: ERROR nova.compute.manager [instance: e7286888-d79d-4632-9c06-69c1ef47fa50] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait
[ 1417.531934] env[68906]: ERROR nova.compute.manager [instance: e7286888-d79d-4632-9c06-69c1ef47fa50] result = hub.switch()
[ 1417.531934] env[68906]: ERROR nova.compute.manager [instance: e7286888-d79d-4632-9c06-69c1ef47fa50] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch
[ 1417.531934] env[68906]: ERROR nova.compute.manager [instance: e7286888-d79d-4632-9c06-69c1ef47fa50] return self.greenlet.switch()
[ 1417.531934] env[68906]: ERROR nova.compute.manager [instance: e7286888-d79d-4632-9c06-69c1ef47fa50] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 1417.531934] env[68906]: ERROR nova.compute.manager [instance: e7286888-d79d-4632-9c06-69c1ef47fa50] self.f(*self.args, **self.kw)
[ 1417.532241] env[68906]: ERROR nova.compute.manager [instance: e7286888-d79d-4632-9c06-69c1ef47fa50] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 1417.532241] env[68906]: ERROR nova.compute.manager [instance: e7286888-d79d-4632-9c06-69c1ef47fa50] raise exceptions.translate_fault(task_info.error)
[ 1417.532241] env[68906]: ERROR nova.compute.manager [instance: e7286888-d79d-4632-9c06-69c1ef47fa50] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 1417.532241] env[68906]: ERROR nova.compute.manager [instance: e7286888-d79d-4632-9c06-69c1ef47fa50] Faults: ['InvalidArgument']
[ 1417.532241] env[68906]: ERROR nova.compute.manager [instance: e7286888-d79d-4632-9c06-69c1ef47fa50]
[ 1417.532702] env[68906]: DEBUG nova.compute.utils [None req-e2cf9422-57fc-4510-acba-da32daf36063 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] [instance: e7286888-d79d-4632-9c06-69c1ef47fa50] VimFaultException {{(pid=68906) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 1417.534477] env[68906]: DEBUG nova.compute.manager [None req-e2cf9422-57fc-4510-acba-da32daf36063 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] [instance: e7286888-d79d-4632-9c06-69c1ef47fa50] Build of instance e7286888-d79d-4632-9c06-69c1ef47fa50 was re-scheduled: A specified parameter was not correct: fileType
[ 1417.534477] env[68906]: Faults: ['InvalidArgument'] {{(pid=68906) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}}
[ 1417.534992] env[68906]: DEBUG nova.compute.manager [None req-e2cf9422-57fc-4510-acba-da32daf36063 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] [instance: e7286888-d79d-4632-9c06-69c1ef47fa50] Unplugging VIFs for instance {{(pid=68906) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}}
[ 1417.535251] env[68906]: DEBUG nova.compute.manager [None req-e2cf9422-57fc-4510-acba-da32daf36063 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged.
{{(pid=68906) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1417.535494] env[68906]: DEBUG nova.compute.manager [None req-e2cf9422-57fc-4510-acba-da32daf36063 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] [instance: e7286888-d79d-4632-9c06-69c1ef47fa50] Deallocating network for instance {{(pid=68906) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1417.535732] env[68906]: DEBUG nova.network.neutron [None req-e2cf9422-57fc-4510-acba-da32daf36063 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] [instance: e7286888-d79d-4632-9c06-69c1ef47fa50] deallocate_for_instance() {{(pid=68906) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1417.847544] env[68906]: DEBUG nova.network.neutron [None req-e2cf9422-57fc-4510-acba-da32daf36063 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] [instance: e7286888-d79d-4632-9c06-69c1ef47fa50] Updating instance_info_cache with network_info: [] {{(pid=68906) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1417.857852] env[68906]: INFO nova.compute.manager [None req-e2cf9422-57fc-4510-acba-da32daf36063 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] [instance: e7286888-d79d-4632-9c06-69c1ef47fa50] Took 0.32 seconds to deallocate network for instance. [ 1417.957022] env[68906]: INFO nova.scheduler.client.report [None req-e2cf9422-57fc-4510-acba-da32daf36063 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Deleted allocations for instance e7286888-d79d-4632-9c06-69c1ef47fa50 [ 1417.976722] env[68906]: DEBUG oslo_concurrency.lockutils [None req-e2cf9422-57fc-4510-acba-da32daf36063 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Lock "e7286888-d79d-4632-9c06-69c1ef47fa50" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 627.191s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1417.978323] env[68906]: DEBUG oslo_concurrency.lockutils [None req-ca7f83ad-d788-4fa0-93cd-bbf7b915f100 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Lock "e7286888-d79d-4632-9c06-69c1ef47fa50" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 431.616s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1417.978514] env[68906]: DEBUG oslo_concurrency.lockutils [None req-ca7f83ad-d788-4fa0-93cd-bbf7b915f100 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Acquiring lock "e7286888-d79d-4632-9c06-69c1ef47fa50-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1417.978710] env[68906]: DEBUG oslo_concurrency.lockutils [None req-ca7f83ad-d788-4fa0-93cd-bbf7b915f100 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Lock "e7286888-d79d-4632-9c06-69c1ef47fa50-events" acquired by 
"nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1417.980022] env[68906]: DEBUG oslo_concurrency.lockutils [None req-ca7f83ad-d788-4fa0-93cd-bbf7b915f100 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Lock "e7286888-d79d-4632-9c06-69c1ef47fa50-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1417.980972] env[68906]: INFO nova.compute.manager [None req-ca7f83ad-d788-4fa0-93cd-bbf7b915f100 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] [instance: e7286888-d79d-4632-9c06-69c1ef47fa50] Terminating instance [ 1417.984029] env[68906]: DEBUG nova.compute.manager [None req-ca7f83ad-d788-4fa0-93cd-bbf7b915f100 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] [instance: e7286888-d79d-4632-9c06-69c1ef47fa50] Start destroying the instance on the hypervisor. {{(pid=68906) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1417.984029] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-ca7f83ad-d788-4fa0-93cd-bbf7b915f100 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] [instance: e7286888-d79d-4632-9c06-69c1ef47fa50] Destroying instance {{(pid=68906) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1417.984506] env[68906]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-73364b95-ee5f-4163-bcd6-8afbf2eb9452 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1417.994257] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-36e2f99f-fe5e-4905-be73-c63125d96ae3 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1418.005914] env[68906]: DEBUG nova.compute.manager [None req-82d3bc4e-32c1-4af2-8d04-b9efe25d88fc tempest-ListServersNegativeTestJSON-844965327 tempest-ListServersNegativeTestJSON-844965327-project-member] [instance: 9b884416-df89-4d8c-b2ab-0667db52a718] Starting instance... {{(pid=68906) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1418.027878] env[68906]: WARNING nova.virt.vmwareapi.vmops [None req-ca7f83ad-d788-4fa0-93cd-bbf7b915f100 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] [instance: e7286888-d79d-4632-9c06-69c1ef47fa50] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance e7286888-d79d-4632-9c06-69c1ef47fa50 could not be found. 
[ 1418.028106] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-ca7f83ad-d788-4fa0-93cd-bbf7b915f100 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] [instance: e7286888-d79d-4632-9c06-69c1ef47fa50] Instance destroyed {{(pid=68906) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1418.028284] env[68906]: INFO nova.compute.manager [None req-ca7f83ad-d788-4fa0-93cd-bbf7b915f100 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] [instance: e7286888-d79d-4632-9c06-69c1ef47fa50] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1418.028537] env[68906]: DEBUG oslo.service.loopingcall [None req-ca7f83ad-d788-4fa0-93cd-bbf7b915f100 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=68906) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1418.028760] env[68906]: DEBUG nova.compute.manager [-] [instance: e7286888-d79d-4632-9c06-69c1ef47fa50] Deallocating network for instance {{(pid=68906) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1418.028855] env[68906]: DEBUG nova.network.neutron [-] [instance: e7286888-d79d-4632-9c06-69c1ef47fa50] deallocate_for_instance() {{(pid=68906) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1418.051071] env[68906]: DEBUG nova.network.neutron [-] [instance: e7286888-d79d-4632-9c06-69c1ef47fa50] Updating instance_info_cache with network_info: [] {{(pid=68906) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1418.054680] env[68906]: DEBUG oslo_concurrency.lockutils [None req-82d3bc4e-32c1-4af2-8d04-b9efe25d88fc tempest-ListServersNegativeTestJSON-844965327 tempest-ListServersNegativeTestJSON-844965327-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1418.054951] env[68906]: DEBUG oslo_concurrency.lockutils [None req-82d3bc4e-32c1-4af2-8d04-b9efe25d88fc tempest-ListServersNegativeTestJSON-844965327 tempest-ListServersNegativeTestJSON-844965327-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1418.056292] env[68906]: INFO nova.compute.claims [None req-82d3bc4e-32c1-4af2-8d04-b9efe25d88fc tempest-ListServersNegativeTestJSON-844965327 tempest-ListServersNegativeTestJSON-844965327-project-member] [instance: 9b884416-df89-4d8c-b2ab-0667db52a718] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1418.059328] env[68906]: INFO nova.compute.manager [-] [instance: e7286888-d79d-4632-9c06-69c1ef47fa50] Took 0.03 seconds to deallocate network for instance. 
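Note how instance_claim for the new build (9b884416-df89-4d8c-b2ab-0667db52a718) and abort_instance_claim for the failed one serialize on the same "compute_resources" lock in the surrounding records. A minimal sketch of that pattern with oslo.concurrency is below, with the tracker internals reduced to a dict for illustration; the real ResourceTracker also reconciles with placement.

from oslo_concurrency import lockutils


class ResourceTrackerSketch:
    def __init__(self):
        self._claims = {}

    @lockutils.synchronized('compute_resources')
    def instance_claim(self, instance_uuid, flavor):
        # Reserve VCPU/MEMORY_MB/DISK_GB for the build; holding the named
        # lock keeps concurrent claims and aborts from racing on inventory.
        self._claims[instance_uuid] = flavor

    @lockutils.synchronized('compute_resources')
    def abort_instance_claim(self, instance_uuid):
        # Release the claim when a build fails and is re-scheduled.
        self._claims.pop(instance_uuid, None)

Using one named lock for both operations is what produces the paired "acquired ... waited" and "released ... held" bookkeeping lines that lockutils emits throughout this log.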
[ 1418.148028] env[68906]: DEBUG oslo_concurrency.lockutils [None req-ca7f83ad-d788-4fa0-93cd-bbf7b915f100 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Lock "e7286888-d79d-4632-9c06-69c1ef47fa50" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.169s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1418.148370] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Lock "e7286888-d79d-4632-9c06-69c1ef47fa50" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 207.196s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1418.148612] env[68906]: INFO nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: e7286888-d79d-4632-9c06-69c1ef47fa50] During sync_power_state the instance has a pending task (deleting). Skip. [ 1418.148800] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Lock "e7286888-d79d-4632-9c06-69c1ef47fa50" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.001s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1418.355087] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f3f98f41-ba74-4743-badf-14b3bcc824c0 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1418.362475] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-64d98c4b-f4dc-4019-b433-a4ff273b8e68 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1418.391053] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7c42dab4-1d98-49fc-9106-830d952a68f4 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1418.397523] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-620d2763-e342-46d6-ad95-797378c0ec6d {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1418.410024] env[68906]: DEBUG nova.compute.provider_tree [None req-82d3bc4e-32c1-4af2-8d04-b9efe25d88fc tempest-ListServersNegativeTestJSON-844965327 tempest-ListServersNegativeTestJSON-844965327-project-member] Inventory has not changed in ProviderTree for provider: 1119f6db-bfd7-4ef3-bdff-5c6974dc249b {{(pid=68906) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1418.419012] env[68906]: DEBUG nova.scheduler.client.report [None req-82d3bc4e-32c1-4af2-8d04-b9efe25d88fc tempest-ListServersNegativeTestJSON-844965327 tempest-ListServersNegativeTestJSON-844965327-project-member] Inventory has not changed for provider 1119f6db-bfd7-4ef3-bdff-5c6974dc249b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 
400, 'reserved': 0, 'min_unit': 1, 'max_unit': 93, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68906) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1418.432283] env[68906]: DEBUG oslo_concurrency.lockutils [None req-82d3bc4e-32c1-4af2-8d04-b9efe25d88fc tempest-ListServersNegativeTestJSON-844965327 tempest-ListServersNegativeTestJSON-844965327-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.377s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1418.432573] env[68906]: DEBUG nova.compute.manager [None req-82d3bc4e-32c1-4af2-8d04-b9efe25d88fc tempest-ListServersNegativeTestJSON-844965327 tempest-ListServersNegativeTestJSON-844965327-project-member] [instance: 9b884416-df89-4d8c-b2ab-0667db52a718] Start building networks asynchronously for instance. {{(pid=68906) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1418.469302] env[68906]: DEBUG nova.compute.utils [None req-82d3bc4e-32c1-4af2-8d04-b9efe25d88fc tempest-ListServersNegativeTestJSON-844965327 tempest-ListServersNegativeTestJSON-844965327-project-member] Using /dev/sd instead of None {{(pid=68906) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1418.470567] env[68906]: DEBUG nova.compute.manager [None req-82d3bc4e-32c1-4af2-8d04-b9efe25d88fc tempest-ListServersNegativeTestJSON-844965327 tempest-ListServersNegativeTestJSON-844965327-project-member] [instance: 9b884416-df89-4d8c-b2ab-0667db52a718] Allocating IP information in the background. {{(pid=68906) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1418.470739] env[68906]: DEBUG nova.network.neutron [None req-82d3bc4e-32c1-4af2-8d04-b9efe25d88fc tempest-ListServersNegativeTestJSON-844965327 tempest-ListServersNegativeTestJSON-844965327-project-member] [instance: 9b884416-df89-4d8c-b2ab-0667db52a718] allocate_for_instance() {{(pid=68906) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1418.480288] env[68906]: DEBUG nova.compute.manager [None req-82d3bc4e-32c1-4af2-8d04-b9efe25d88fc tempest-ListServersNegativeTestJSON-844965327 tempest-ListServersNegativeTestJSON-844965327-project-member] [instance: 9b884416-df89-4d8c-b2ab-0667db52a718] Start building block device mappings for instance. {{(pid=68906) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1418.529155] env[68906]: DEBUG nova.policy [None req-82d3bc4e-32c1-4af2-8d04-b9efe25d88fc tempest-ListServersNegativeTestJSON-844965327 tempest-ListServersNegativeTestJSON-844965327-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'd1caf803860c40fb83148df6917edfc9', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '8ca25c17ed7d44a487aed613b73139d6', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68906) authorize /opt/stack/nova/nova/policy.py:203}} [ 1418.567030] env[68906]: DEBUG nova.compute.manager [None req-82d3bc4e-32c1-4af2-8d04-b9efe25d88fc tempest-ListServersNegativeTestJSON-844965327 tempest-ListServersNegativeTestJSON-844965327-project-member] [instance: 9b884416-df89-4d8c-b2ab-0667db52a718] Start spawning the instance on the hypervisor. 
{{(pid=68906) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1418.592096] env[68906]: DEBUG nova.virt.hardware [None req-82d3bc4e-32c1-4af2-8d04-b9efe25d88fc tempest-ListServersNegativeTestJSON-844965327 tempest-ListServersNegativeTestJSON-844965327-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-17T13:00:39Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-17T13:00:23Z,direct_url=,disk_format='vmdk',id=b1400c31-d33b-4e13-944f-4c645e62493e,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='1ae7bf3a375d41c6af5e7536af51ffd1',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-17T13:00:24Z,virtual_size=,visibility=), allow threads: False {{(pid=68906) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1418.592352] env[68906]: DEBUG nova.virt.hardware [None req-82d3bc4e-32c1-4af2-8d04-b9efe25d88fc tempest-ListServersNegativeTestJSON-844965327 tempest-ListServersNegativeTestJSON-844965327-project-member] Flavor limits 0:0:0 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1418.592511] env[68906]: DEBUG nova.virt.hardware [None req-82d3bc4e-32c1-4af2-8d04-b9efe25d88fc tempest-ListServersNegativeTestJSON-844965327 tempest-ListServersNegativeTestJSON-844965327-project-member] Image limits 0:0:0 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1418.592690] env[68906]: DEBUG nova.virt.hardware [None req-82d3bc4e-32c1-4af2-8d04-b9efe25d88fc tempest-ListServersNegativeTestJSON-844965327 tempest-ListServersNegativeTestJSON-844965327-project-member] Flavor pref 0:0:0 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1418.592834] env[68906]: DEBUG nova.virt.hardware [None req-82d3bc4e-32c1-4af2-8d04-b9efe25d88fc tempest-ListServersNegativeTestJSON-844965327 tempest-ListServersNegativeTestJSON-844965327-project-member] Image pref 0:0:0 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1418.592979] env[68906]: DEBUG nova.virt.hardware [None req-82d3bc4e-32c1-4af2-8d04-b9efe25d88fc tempest-ListServersNegativeTestJSON-844965327 tempest-ListServersNegativeTestJSON-844965327-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1418.593208] env[68906]: DEBUG nova.virt.hardware [None req-82d3bc4e-32c1-4af2-8d04-b9efe25d88fc tempest-ListServersNegativeTestJSON-844965327 tempest-ListServersNegativeTestJSON-844965327-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68906) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1418.593365] env[68906]: DEBUG nova.virt.hardware [None req-82d3bc4e-32c1-4af2-8d04-b9efe25d88fc tempest-ListServersNegativeTestJSON-844965327 tempest-ListServersNegativeTestJSON-844965327-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68906) _get_possible_cpu_topologies 
/opt/stack/nova/nova/virt/hardware.py:471}} [ 1418.593528] env[68906]: DEBUG nova.virt.hardware [None req-82d3bc4e-32c1-4af2-8d04-b9efe25d88fc tempest-ListServersNegativeTestJSON-844965327 tempest-ListServersNegativeTestJSON-844965327-project-member] Got 1 possible topologies {{(pid=68906) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1418.593688] env[68906]: DEBUG nova.virt.hardware [None req-82d3bc4e-32c1-4af2-8d04-b9efe25d88fc tempest-ListServersNegativeTestJSON-844965327 tempest-ListServersNegativeTestJSON-844965327-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68906) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1418.593856] env[68906]: DEBUG nova.virt.hardware [None req-82d3bc4e-32c1-4af2-8d04-b9efe25d88fc tempest-ListServersNegativeTestJSON-844965327 tempest-ListServersNegativeTestJSON-844965327-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68906) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1418.594722] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-353897ba-eeb8-4243-9004-9a9186b3c074 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1418.602396] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4183f58e-3137-4244-8ea2-40806b615875 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1418.970334] env[68906]: DEBUG nova.network.neutron [None req-82d3bc4e-32c1-4af2-8d04-b9efe25d88fc tempest-ListServersNegativeTestJSON-844965327 tempest-ListServersNegativeTestJSON-844965327-project-member] [instance: 9b884416-df89-4d8c-b2ab-0667db52a718] Successfully created port: 30d3b7ae-8e1d-43d8-88f9-eb869b77ad89 {{(pid=68906) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1419.536470] env[68906]: DEBUG nova.network.neutron [None req-82d3bc4e-32c1-4af2-8d04-b9efe25d88fc tempest-ListServersNegativeTestJSON-844965327 tempest-ListServersNegativeTestJSON-844965327-project-member] [instance: 9b884416-df89-4d8c-b2ab-0667db52a718] Successfully updated port: 30d3b7ae-8e1d-43d8-88f9-eb869b77ad89 {{(pid=68906) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1419.547463] env[68906]: DEBUG oslo_concurrency.lockutils [None req-82d3bc4e-32c1-4af2-8d04-b9efe25d88fc tempest-ListServersNegativeTestJSON-844965327 tempest-ListServersNegativeTestJSON-844965327-project-member] Acquiring lock "refresh_cache-9b884416-df89-4d8c-b2ab-0667db52a718" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1419.547618] env[68906]: DEBUG oslo_concurrency.lockutils [None req-82d3bc4e-32c1-4af2-8d04-b9efe25d88fc tempest-ListServersNegativeTestJSON-844965327 tempest-ListServersNegativeTestJSON-844965327-project-member] Acquired lock "refresh_cache-9b884416-df89-4d8c-b2ab-0667db52a718" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1419.547768] env[68906]: DEBUG nova.network.neutron [None req-82d3bc4e-32c1-4af2-8d04-b9efe25d88fc tempest-ListServersNegativeTestJSON-844965327 tempest-ListServersNegativeTestJSON-844965327-project-member] [instance: 9b884416-df89-4d8c-b2ab-0667db52a718] Building network info cache for instance {{(pid=68906) _get_instance_nw_info 
/opt/stack/nova/nova/network/neutron.py:2010}} [ 1419.585756] env[68906]: DEBUG nova.network.neutron [None req-82d3bc4e-32c1-4af2-8d04-b9efe25d88fc tempest-ListServersNegativeTestJSON-844965327 tempest-ListServersNegativeTestJSON-844965327-project-member] [instance: 9b884416-df89-4d8c-b2ab-0667db52a718] Instance cache missing network info. {{(pid=68906) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1419.746151] env[68906]: DEBUG nova.network.neutron [None req-82d3bc4e-32c1-4af2-8d04-b9efe25d88fc tempest-ListServersNegativeTestJSON-844965327 tempest-ListServersNegativeTestJSON-844965327-project-member] [instance: 9b884416-df89-4d8c-b2ab-0667db52a718] Updating instance_info_cache with network_info: [{"id": "30d3b7ae-8e1d-43d8-88f9-eb869b77ad89", "address": "fa:16:3e:20:cc:0a", "network": {"id": "e0b0fc98-cd6c-426d-bfd3-c86de3314024", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-2016737718-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "8ca25c17ed7d44a487aed613b73139d6", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "e5d88cd9-35a3-4ac3-9d6d-756464cd6cc5", "external-id": "nsx-vlan-transportzone-685", "segmentation_id": 685, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap30d3b7ae-8e", "ovs_interfaceid": "30d3b7ae-8e1d-43d8-88f9-eb869b77ad89", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68906) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1419.760180] env[68906]: DEBUG oslo_concurrency.lockutils [None req-82d3bc4e-32c1-4af2-8d04-b9efe25d88fc tempest-ListServersNegativeTestJSON-844965327 tempest-ListServersNegativeTestJSON-844965327-project-member] Releasing lock "refresh_cache-9b884416-df89-4d8c-b2ab-0667db52a718" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1419.760180] env[68906]: DEBUG nova.compute.manager [None req-82d3bc4e-32c1-4af2-8d04-b9efe25d88fc tempest-ListServersNegativeTestJSON-844965327 tempest-ListServersNegativeTestJSON-844965327-project-member] [instance: 9b884416-df89-4d8c-b2ab-0667db52a718] Instance network_info: |[{"id": "30d3b7ae-8e1d-43d8-88f9-eb869b77ad89", "address": "fa:16:3e:20:cc:0a", "network": {"id": "e0b0fc98-cd6c-426d-bfd3-c86de3314024", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-2016737718-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "8ca25c17ed7d44a487aed613b73139d6", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "e5d88cd9-35a3-4ac3-9d6d-756464cd6cc5", "external-id": 
"nsx-vlan-transportzone-685", "segmentation_id": 685, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap30d3b7ae-8e", "ovs_interfaceid": "30d3b7ae-8e1d-43d8-88f9-eb869b77ad89", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=68906) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1419.760302] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-82d3bc4e-32c1-4af2-8d04-b9efe25d88fc tempest-ListServersNegativeTestJSON-844965327 tempest-ListServersNegativeTestJSON-844965327-project-member] [instance: 9b884416-df89-4d8c-b2ab-0667db52a718] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:20:cc:0a', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'e5d88cd9-35a3-4ac3-9d6d-756464cd6cc5', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '30d3b7ae-8e1d-43d8-88f9-eb869b77ad89', 'vif_model': 'vmxnet3'}] {{(pid=68906) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1419.767588] env[68906]: DEBUG nova.virt.vmwareapi.vm_util [None req-82d3bc4e-32c1-4af2-8d04-b9efe25d88fc tempest-ListServersNegativeTestJSON-844965327 tempest-ListServersNegativeTestJSON-844965327-project-member] Creating folder: Project (8ca25c17ed7d44a487aed613b73139d6). Parent ref: group-v694750. {{(pid=68906) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1419.768108] env[68906]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-e55a062d-8dd4-405b-9a8e-94e50e79ccd8 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1419.779794] env[68906]: INFO nova.virt.vmwareapi.vm_util [None req-82d3bc4e-32c1-4af2-8d04-b9efe25d88fc tempest-ListServersNegativeTestJSON-844965327 tempest-ListServersNegativeTestJSON-844965327-project-member] Created folder: Project (8ca25c17ed7d44a487aed613b73139d6) in parent group-v694750. [ 1419.779994] env[68906]: DEBUG nova.virt.vmwareapi.vm_util [None req-82d3bc4e-32c1-4af2-8d04-b9efe25d88fc tempest-ListServersNegativeTestJSON-844965327 tempest-ListServersNegativeTestJSON-844965327-project-member] Creating folder: Instances. Parent ref: group-v694829. {{(pid=68906) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1419.780236] env[68906]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-24d73226-c51a-42c6-847c-faf37a3a930d {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1419.788488] env[68906]: INFO nova.virt.vmwareapi.vm_util [None req-82d3bc4e-32c1-4af2-8d04-b9efe25d88fc tempest-ListServersNegativeTestJSON-844965327 tempest-ListServersNegativeTestJSON-844965327-project-member] Created folder: Instances in parent group-v694829. [ 1419.788709] env[68906]: DEBUG oslo.service.loopingcall [None req-82d3bc4e-32c1-4af2-8d04-b9efe25d88fc tempest-ListServersNegativeTestJSON-844965327 tempest-ListServersNegativeTestJSON-844965327-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=68906) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1419.788885] env[68906]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 9b884416-df89-4d8c-b2ab-0667db52a718] Creating VM on the ESX host {{(pid=68906) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1419.789087] env[68906]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-e5d4f40e-acdc-4bf2-9df5-7586c1ebbd02 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1419.808634] env[68906]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1419.808634] env[68906]: value = "task-3475395" [ 1419.808634] env[68906]: _type = "Task" [ 1419.808634] env[68906]: } to complete. {{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1419.815572] env[68906]: DEBUG oslo_vmware.api [-] Task: {'id': task-3475395, 'name': CreateVM_Task} progress is 0%. {{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1419.846563] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1419.941167] env[68906]: DEBUG nova.compute.manager [req-aeba0bef-9c1f-41f7-b729-929222a5890f req-489e8f7f-3f22-42b9-a464-a997f5a744c9 service nova] [instance: 9b884416-df89-4d8c-b2ab-0667db52a718] Received event network-vif-plugged-30d3b7ae-8e1d-43d8-88f9-eb869b77ad89 {{(pid=68906) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1419.941167] env[68906]: DEBUG oslo_concurrency.lockutils [req-aeba0bef-9c1f-41f7-b729-929222a5890f req-489e8f7f-3f22-42b9-a464-a997f5a744c9 service nova] Acquiring lock "9b884416-df89-4d8c-b2ab-0667db52a718-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1419.941167] env[68906]: DEBUG oslo_concurrency.lockutils [req-aeba0bef-9c1f-41f7-b729-929222a5890f req-489e8f7f-3f22-42b9-a464-a997f5a744c9 service nova] Lock "9b884416-df89-4d8c-b2ab-0667db52a718-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1419.941167] env[68906]: DEBUG oslo_concurrency.lockutils [req-aeba0bef-9c1f-41f7-b729-929222a5890f req-489e8f7f-3f22-42b9-a464-a997f5a744c9 service nova] Lock "9b884416-df89-4d8c-b2ab-0667db52a718-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1419.941287] env[68906]: DEBUG nova.compute.manager [req-aeba0bef-9c1f-41f7-b729-929222a5890f req-489e8f7f-3f22-42b9-a464-a997f5a744c9 service nova] [instance: 9b884416-df89-4d8c-b2ab-0667db52a718] No waiting events found dispatching network-vif-plugged-30d3b7ae-8e1d-43d8-88f9-eb869b77ad89 {{(pid=68906) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1419.941287] env[68906]: WARNING nova.compute.manager [req-aeba0bef-9c1f-41f7-b729-929222a5890f req-489e8f7f-3f22-42b9-a464-a997f5a744c9 service nova] [instance: 
9b884416-df89-4d8c-b2ab-0667db52a718] Received unexpected event network-vif-plugged-30d3b7ae-8e1d-43d8-88f9-eb869b77ad89 for instance with vm_state building and task_state spawning. [ 1419.941287] env[68906]: DEBUG nova.compute.manager [req-aeba0bef-9c1f-41f7-b729-929222a5890f req-489e8f7f-3f22-42b9-a464-a997f5a744c9 service nova] [instance: 9b884416-df89-4d8c-b2ab-0667db52a718] Received event network-changed-30d3b7ae-8e1d-43d8-88f9-eb869b77ad89 {{(pid=68906) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1419.941287] env[68906]: DEBUG nova.compute.manager [req-aeba0bef-9c1f-41f7-b729-929222a5890f req-489e8f7f-3f22-42b9-a464-a997f5a744c9 service nova] [instance: 9b884416-df89-4d8c-b2ab-0667db52a718] Refreshing instance network info cache due to event network-changed-30d3b7ae-8e1d-43d8-88f9-eb869b77ad89. {{(pid=68906) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1419.941287] env[68906]: DEBUG oslo_concurrency.lockutils [req-aeba0bef-9c1f-41f7-b729-929222a5890f req-489e8f7f-3f22-42b9-a464-a997f5a744c9 service nova] Acquiring lock "refresh_cache-9b884416-df89-4d8c-b2ab-0667db52a718" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1419.941412] env[68906]: DEBUG oslo_concurrency.lockutils [req-aeba0bef-9c1f-41f7-b729-929222a5890f req-489e8f7f-3f22-42b9-a464-a997f5a744c9 service nova] Acquired lock "refresh_cache-9b884416-df89-4d8c-b2ab-0667db52a718" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1419.941412] env[68906]: DEBUG nova.network.neutron [req-aeba0bef-9c1f-41f7-b729-929222a5890f req-489e8f7f-3f22-42b9-a464-a997f5a744c9 service nova] [instance: 9b884416-df89-4d8c-b2ab-0667db52a718] Refreshing network info cache for port 30d3b7ae-8e1d-43d8-88f9-eb869b77ad89 {{(pid=68906) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1420.196605] env[68906]: DEBUG nova.network.neutron [req-aeba0bef-9c1f-41f7-b729-929222a5890f req-489e8f7f-3f22-42b9-a464-a997f5a744c9 service nova] [instance: 9b884416-df89-4d8c-b2ab-0667db52a718] Updated VIF entry in instance network info cache for port 30d3b7ae-8e1d-43d8-88f9-eb869b77ad89. 
{{(pid=68906) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1420.197016] env[68906]: DEBUG nova.network.neutron [req-aeba0bef-9c1f-41f7-b729-929222a5890f req-489e8f7f-3f22-42b9-a464-a997f5a744c9 service nova] [instance: 9b884416-df89-4d8c-b2ab-0667db52a718] Updating instance_info_cache with network_info: [{"id": "30d3b7ae-8e1d-43d8-88f9-eb869b77ad89", "address": "fa:16:3e:20:cc:0a", "network": {"id": "e0b0fc98-cd6c-426d-bfd3-c86de3314024", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-2016737718-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "8ca25c17ed7d44a487aed613b73139d6", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "e5d88cd9-35a3-4ac3-9d6d-756464cd6cc5", "external-id": "nsx-vlan-transportzone-685", "segmentation_id": 685, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap30d3b7ae-8e", "ovs_interfaceid": "30d3b7ae-8e1d-43d8-88f9-eb869b77ad89", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68906) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1420.206649] env[68906]: DEBUG oslo_concurrency.lockutils [req-aeba0bef-9c1f-41f7-b729-929222a5890f req-489e8f7f-3f22-42b9-a464-a997f5a744c9 service nova] Releasing lock "refresh_cache-9b884416-df89-4d8c-b2ab-0667db52a718" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1420.320399] env[68906]: DEBUG oslo_vmware.api [-] Task: {'id': task-3475395, 'name': CreateVM_Task, 'duration_secs': 0.296139} completed successfully. 
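The instance_info_cache entry just printed is a plain JSON-serializable structure. A short sketch of pulling the useful fields out of one VIF entry, using the values from the log line above (only the summarize helper is invented):

```python
# Reading the cached network_info entry above (stdlib only). The dict layout
# mirrors what the log prints, trimmed to the fields used here.
vif = {
    "id": "30d3b7ae-8e1d-43d8-88f9-eb869b77ad89",
    "address": "fa:16:3e:20:cc:0a",
    "type": "ovs",
    "devname": "tap30d3b7ae-8e",
    "network": {
        "bridge": "br-int",
        "meta": {"mtu": 8950, "tunneled": False},
        "subnets": [{
            "cidr": "192.168.128.0/28",
            "gateway": {"address": "192.168.128.1"},
            "ips": [{"address": "192.168.128.10", "type": "fixed"}],
        }],
    },
}

def summarize(vif):
    subnet = vif["network"]["subnets"][0]
    fixed = [ip["address"] for ip in subnet["ips"] if ip["type"] == "fixed"]
    return (f"port {vif['id'][:8]} mac={vif['address']} dev={vif['devname']} "
            f"ips={fixed} cidr={subnet['cidr']} gw={subnet['gateway']['address']} "
            f"mtu={vif['network']['meta']['mtu']}")

print(summarize(vif))
# port 30d3b7ae mac=fa:16:3e:20:cc:0a dev=tap30d3b7ae-8e ips=['192.168.128.10'] ...
```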
{{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1420.320399] env[68906]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 9b884416-df89-4d8c-b2ab-0667db52a718] Created VM on the ESX host {{(pid=68906) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1420.320988] env[68906]: DEBUG oslo_concurrency.lockutils [None req-82d3bc4e-32c1-4af2-8d04-b9efe25d88fc tempest-ListServersNegativeTestJSON-844965327 tempest-ListServersNegativeTestJSON-844965327-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1420.321165] env[68906]: DEBUG oslo_concurrency.lockutils [None req-82d3bc4e-32c1-4af2-8d04-b9efe25d88fc tempest-ListServersNegativeTestJSON-844965327 tempest-ListServersNegativeTestJSON-844965327-project-member] Acquired lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1420.321479] env[68906]: DEBUG oslo_concurrency.lockutils [None req-82d3bc4e-32c1-4af2-8d04-b9efe25d88fc tempest-ListServersNegativeTestJSON-844965327 tempest-ListServersNegativeTestJSON-844965327-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1420.321758] env[68906]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-5a69f832-7316-4f99-8665-cea24b983b10 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1420.326103] env[68906]: DEBUG oslo_vmware.api [None req-82d3bc4e-32c1-4af2-8d04-b9efe25d88fc tempest-ListServersNegativeTestJSON-844965327 tempest-ListServersNegativeTestJSON-844965327-project-member] Waiting for the task: (returnval){ [ 1420.326103] env[68906]: value = "session[52a3cb0d-1212-490c-2669-91043b4da4d8]522b5663-c3f9-4ccb-5562-e0a3b3d49b6e" [ 1420.326103] env[68906]: _type = "Task" [ 1420.326103] env[68906]: } to complete. {{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1420.333445] env[68906]: DEBUG oslo_vmware.api [None req-82d3bc4e-32c1-4af2-8d04-b9efe25d88fc tempest-ListServersNegativeTestJSON-844965327 tempest-ListServersNegativeTestJSON-844965327-project-member] Task: {'id': session[52a3cb0d-1212-490c-2669-91043b4da4d8]522b5663-c3f9-4ccb-5562-e0a3b3d49b6e, 'name': SearchDatastore_Task} progress is 0%. 
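Both CreateVM_Task and SearchDatastore_Task above follow the same lifecycle: the API call returns a task handle immediately and the caller polls it until it reports success or error, which produces the "progress is 0%." then "completed successfully" pairs. oslo.vmware drives this with an eventlet looping call; the sketch below shows the same shape with the standard library only, all names invented:

```python
# Poll-until-complete loop, the shape behind wait_for_task/_poll_task above.
import time

def wait_for_task(poll, interval=0.5, timeout=60.0):
    """poll() returns a dict like {'state': 'running', 'progress': 0} or
    {'state': 'success'} / {'state': 'error', 'error': Exception(...)}."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        info = poll()
        if info["state"] == "success":
            return info
        if info["state"] == "error":
            raise info["error"]              # translated to a fault elsewhere
        print(f"progress is {info.get('progress', 0)}%")
        time.sleep(interval)
    raise TimeoutError("task did not complete in time")

states = iter([{"state": "running", "progress": 0}, {"state": "success"}])
wait_for_task(lambda: next(states), interval=0.01)
```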
{{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1420.836677] env[68906]: DEBUG oslo_concurrency.lockutils [None req-82d3bc4e-32c1-4af2-8d04-b9efe25d88fc tempest-ListServersNegativeTestJSON-844965327 tempest-ListServersNegativeTestJSON-844965327-project-member] Releasing lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1420.836677] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-82d3bc4e-32c1-4af2-8d04-b9efe25d88fc tempest-ListServersNegativeTestJSON-844965327 tempest-ListServersNegativeTestJSON-844965327-project-member] [instance: 9b884416-df89-4d8c-b2ab-0667db52a718] Processing image b1400c31-d33b-4e13-944f-4c645e62493e {{(pid=68906) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1420.836677] env[68906]: DEBUG oslo_concurrency.lockutils [None req-82d3bc4e-32c1-4af2-8d04-b9efe25d88fc tempest-ListServersNegativeTestJSON-844965327 tempest-ListServersNegativeTestJSON-844965327-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e/b1400c31-d33b-4e13-944f-4c645e62493e.vmdk" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1421.140590] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1422.048889] env[68906]: DEBUG oslo_concurrency.lockutils [None req-1f23ce48-c412-41ea-9aae-c46a8abfa944 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Acquiring lock "32f5b54d-30bf-4fe9-9622-3ff74344b3f3" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1422.049140] env[68906]: DEBUG oslo_concurrency.lockutils [None req-1f23ce48-c412-41ea-9aae-c46a8abfa944 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Lock "32f5b54d-30bf-4fe9-9622-3ff74344b3f3" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1424.139937] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1425.140755] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1425.141202] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Starting heal instance info cache {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 1425.141202] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Rebuilding the list of 
instances to heal {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 1425.164631] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: 641cca5b-d749-4331-a5e0-8acb6d47cba2] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1425.164881] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: 9bd17d9a-2f34-4e99-91b1-a7c3fbc1f29a] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1425.165114] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: 4d36bb91-0cde-44cb-8706-d17740a9cf50] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1425.165325] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: db011373-7285-4882-8bce-d39cfa22fe80] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1425.165514] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: 1fdb401a-ac25-4418-803c-fc0b2297f2d4] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1425.165657] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: 6c28f571-e74a-48f9-9cc7-a9e4ddea8b09] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1425.165780] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: e0e595e3-e47e-4cf1-8977-f004eca942d1] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1425.165899] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: 7466df8a-59a9-49b9-bff7-c4efbeae3eee] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1425.166027] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: 89171680-c76d-4826-9236-379542661ffb] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1425.166148] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: 9b884416-df89-4d8c-b2ab-0667db52a718] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1425.166265] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Didn't find any instances for network info cache update. 
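The heal pass above walks every instance known to the host and skips any whose vm_state is still building, which in this run is all ten of them, hence the closing "Didn't find any instances" line. The selection step reduces to a filter like the sketch below (field names assumed):

```python
# Sketch of the skip-if-building selection in the heal pass above.
BUILDING = "building"

def instances_to_heal(instances):
    to_heal = []
    for inst in instances:
        if inst["vm_state"] == BUILDING:
            print(f"[instance: {inst['uuid']}] Skipping network cache update "
                  "for instance because it is Building.")
            continue
        to_heal.append(inst)
    return to_heal

insts = [{"uuid": "641cca5b-d749-4331-a5e0-8acb6d47cba2", "vm_state": BUILDING}]
if not instances_to_heal(insts):
    print("Didn't find any instances for network info cache update.")
```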
{{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 1425.166748] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1425.166891] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=68906) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 1426.140625] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1428.135672] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1428.140420] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1429.141480] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1430.140623] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager.update_available_resource {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1430.154099] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1430.154099] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1430.154099] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1430.154099] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=68906) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1430.154826] env[68906]: DEBUG oslo_vmware.service [-] 
Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-075b42de-2e30-42ef-99be-a81ab12c96d4 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1430.165318] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-637cd136-3436-435c-9b51-b6fd2bbb2cac {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1430.179860] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-071fee2c-e80a-42ce-af57-f78d91dde291 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1430.186655] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0882b656-d1ef-4975-93dd-5dbae2f3d01d {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1430.221379] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180964MB free_disk=93GB free_vcpus=48 pci_devices=None {{(pid=68906) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1430.221558] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1430.221775] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1430.301800] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 641cca5b-d749-4331-a5e0-8acb6d47cba2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1430.301800] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 9bd17d9a-2f34-4e99-91b1-a7c3fbc1f29a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1430.301800] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 4d36bb91-0cde-44cb-8706-d17740a9cf50 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1430.301800] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance db011373-7285-4882-8bce-d39cfa22fe80 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1430.302034] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 1fdb401a-ac25-4418-803c-fc0b2297f2d4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1430.302034] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 6c28f571-e74a-48f9-9cc7-a9e4ddea8b09 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1430.302089] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance e0e595e3-e47e-4cf1-8977-f004eca942d1 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1430.302202] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 7466df8a-59a9-49b9-bff7-c4efbeae3eee actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1430.302322] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 89171680-c76d-4826-9236-379542661ffb actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1430.302435] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 9b884416-df89-4d8c-b2ab-0667db52a718 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1430.314687] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 917ba3c3-9188-40fa-be6c-cdab27b76970 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1430.325176] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 7803f951-a0c0-4246-b2d9-3eabadfa679d has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1430.356615] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 8a4e18b6-55c0-4397-b570-27db4541e9b3 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1430.368816] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 2b688987-d4cf-4ebb-83c4-d5fa7f5bcbb9 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1430.379526] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 3ce59687-c677-40bd-8af4-c2f4b576e86e has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1430.389923] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 45c0d7ba-6d21-46d1-8bcb-0318bd93f885 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1430.400940] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 75a4f8bc-09aa-4c9b-b705-fb84ddcf60ea has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1430.411072] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance ff99f1e3-9a4a-487e-afcb-6d8439a0491d has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1430.420800] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance aed06616-d008-4695-b66e-9f40acf5ebd3 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1430.432253] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance c3386804-6ed9-46fe-b26d-3b5aae52c84b has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1430.442932] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 17327bc3-433e-4006-93c7-e53714ed70c2 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1430.456437] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 32f5b54d-30bf-4fe9-9622-3ff74344b3f3 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
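The resource audit is pure bookkeeping: each of the instances above holds {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}, ten allocations are counted against the node, and the inventory reserves 512 MB of RAM, which is exactly where the "Final resource view" figures below (used_ram=1792MB, used_disk=10GB, used_vcpus=10) come from. A quick check of that arithmetic:

```python
# Recomputing the "Final resource view" numbers from the per-instance
# allocations listed above plus the 512 MB reserved in the inventory.
allocations = [{"DISK_GB": 1, "MEMORY_MB": 128, "VCPU": 1}] * 10
reserved_ram_mb = 512

used_ram = reserved_ram_mb + sum(a["MEMORY_MB"] for a in allocations)
used_vcpus = sum(a["VCPU"] for a in allocations)
used_disk = sum(a["DISK_GB"] for a in allocations)

assert (used_ram, used_vcpus, used_disk) == (1792, 10, 10)  # matches the log
```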
{{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1430.456680] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=68906) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1430.456827] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=68906) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1430.694716] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bb2957e4-bbba-4847-ae42-977e837c5186 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1430.702698] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-767f8c99-114c-4416-b7ce-0b1034b6f8a3 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1430.731276] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-53322050-ca77-413c-9591-84a853fc184f {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1430.738396] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4cb49242-5f6f-49b5-affe-43e76f9b6e4e {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1430.751553] env[68906]: DEBUG nova.compute.provider_tree [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Inventory has not changed in ProviderTree for provider: 1119f6db-bfd7-4ef3-bdff-5c6974dc249b {{(pid=68906) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1430.759585] env[68906]: DEBUG nova.scheduler.client.report [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Inventory has not changed for provider 1119f6db-bfd7-4ef3-bdff-5c6974dc249b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 93, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68906) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1430.773847] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=68906) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1430.774031] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.552s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1430.883175] env[68906]: DEBUG oslo_concurrency.lockutils [None 
req-d21cc286-8d6d-4956-a84c-6659f57d9db9 tempest-ListServersNegativeTestJSON-844965327 tempest-ListServersNegativeTestJSON-844965327-project-member] Acquiring lock "9b884416-df89-4d8c-b2ab-0667db52a718" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1451.909402] env[68906]: DEBUG oslo_concurrency.lockutils [None req-90e8e710-720b-4aa7-ba38-0b17c5edf061 tempest-AttachVolumeTestJSON-1667500444 tempest-AttachVolumeTestJSON-1667500444-project-member] Acquiring lock "922d81ba-c8d2-43ba-b1c5-f2943418d6a2" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1451.909688] env[68906]: DEBUG oslo_concurrency.lockutils [None req-90e8e710-720b-4aa7-ba38-0b17c5edf061 tempest-AttachVolumeTestJSON-1667500444 tempest-AttachVolumeTestJSON-1667500444-project-member] Lock "922d81ba-c8d2-43ba-b1c5-f2943418d6a2" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1455.309914] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fbc84413-841c-44c5-a0d9-1ff0baa7121b tempest-ServerDiskConfigTestJSON-1909384467 tempest-ServerDiskConfigTestJSON-1909384467-project-member] Acquiring lock "302e2275-a3ec-48c5-899e-6f385190bfe8" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1455.310328] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fbc84413-841c-44c5-a0d9-1ff0baa7121b tempest-ServerDiskConfigTestJSON-1909384467 tempest-ServerDiskConfigTestJSON-1909384467-project-member] Lock "302e2275-a3ec-48c5-899e-6f385190bfe8" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1457.154242] env[68906]: DEBUG oslo_concurrency.lockutils [None req-bf37900d-8159-4835-aaeb-3f0e048a23be tempest-SecurityGroupsTestJSON-973572118 tempest-SecurityGroupsTestJSON-973572118-project-member] Acquiring lock "a59ab448-c4f1-4f54-be7a-7e204130f3f8" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1457.154610] env[68906]: DEBUG oslo_concurrency.lockutils [None req-bf37900d-8159-4835-aaeb-3f0e048a23be tempest-SecurityGroupsTestJSON-973572118 tempest-SecurityGroupsTestJSON-973572118-project-member] Lock "a59ab448-c4f1-4f54-be7a-7e204130f3f8" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1462.768045] env[68906]: WARNING oslo_vmware.rw_handles [None req-8e5cf191-d2ff-420a-92ea-3d8a1505aeef tempest-ServersTestManualDisk-672681136 tempest-ServersTestManualDisk-672681136-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without 
response [ 1462.768045] env[68906]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1462.768045] env[68906]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1462.768045] env[68906]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1462.768045] env[68906]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1462.768045] env[68906]: ERROR oslo_vmware.rw_handles response.begin() [ 1462.768045] env[68906]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1462.768045] env[68906]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1462.768045] env[68906]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1462.768045] env[68906]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1462.768045] env[68906]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1462.768045] env[68906]: ERROR oslo_vmware.rw_handles [ 1462.768045] env[68906]: DEBUG nova.virt.vmwareapi.images [None req-8e5cf191-d2ff-420a-92ea-3d8a1505aeef tempest-ServersTestManualDisk-672681136 tempest-ServersTestManualDisk-672681136-project-member] [instance: 641cca5b-d749-4331-a5e0-8acb6d47cba2] Downloaded image file data b1400c31-d33b-4e13-944f-4c645e62493e to vmware_temp/11359a14-7a18-40db-b52c-dc55b45ab866/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk on the data store datastore2 {{(pid=68906) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1462.770557] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-8e5cf191-d2ff-420a-92ea-3d8a1505aeef tempest-ServersTestManualDisk-672681136 tempest-ServersTestManualDisk-672681136-project-member] [instance: 641cca5b-d749-4331-a5e0-8acb6d47cba2] Caching image {{(pid=68906) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1462.770820] env[68906]: DEBUG nova.virt.vmwareapi.vm_util [None req-8e5cf191-d2ff-420a-92ea-3d8a1505aeef tempest-ServersTestManualDisk-672681136 tempest-ServersTestManualDisk-672681136-project-member] Copying Virtual Disk [datastore2] vmware_temp/11359a14-7a18-40db-b52c-dc55b45ab866/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk to [datastore2] vmware_temp/11359a14-7a18-40db-b52c-dc55b45ab866/b1400c31-d33b-4e13-944f-4c645e62493e/b1400c31-d33b-4e13-944f-4c645e62493e.vmdk {{(pid=68906) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1462.771113] env[68906]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-24a0ab56-79d9-4eed-b5e9-1f6597a45e7a {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1462.779080] env[68906]: DEBUG oslo_vmware.api [None req-8e5cf191-d2ff-420a-92ea-3d8a1505aeef tempest-ServersTestManualDisk-672681136 tempest-ServersTestManualDisk-672681136-project-member] Waiting for the task: (returnval){ [ 1462.779080] env[68906]: value = "task-3475396" [ 1462.779080] env[68906]: _type = "Task" [ 1462.779080] env[68906]: } to complete. 
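The WARNING traceback above is raised while draining the HTTP response as the image read handle is closed: the payload has already been transferred, the vCenter side has dropped the connection, and http.client raises RemoteDisconnected. The exception is from the standard library; handling it looks roughly like the sketch below (the connection setup itself is illustrative):

```python
# Catching the benign RemoteDisconnected seen in the traceback above.
import http.client

def close_handle(conn: http.client.HTTPSConnection):
    try:
        conn.getresponse()                  # may raise if the peer hung up
    except http.client.RemoteDisconnected as exc:
        # Non-fatal here: the data was already written, so log and move on.
        print(f"Error occurred while reading the HTTP response.: {exc!r}")
    finally:
        conn.close()
```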
{{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1462.787622] env[68906]: DEBUG oslo_vmware.api [None req-8e5cf191-d2ff-420a-92ea-3d8a1505aeef tempest-ServersTestManualDisk-672681136 tempest-ServersTestManualDisk-672681136-project-member] Task: {'id': task-3475396, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1463.290109] env[68906]: DEBUG oslo_vmware.exceptions [None req-8e5cf191-d2ff-420a-92ea-3d8a1505aeef tempest-ServersTestManualDisk-672681136 tempest-ServersTestManualDisk-672681136-project-member] Fault InvalidArgument not matched. {{(pid=68906) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1463.290416] env[68906]: DEBUG oslo_concurrency.lockutils [None req-8e5cf191-d2ff-420a-92ea-3d8a1505aeef tempest-ServersTestManualDisk-672681136 tempest-ServersTestManualDisk-672681136-project-member] Releasing lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e/b1400c31-d33b-4e13-944f-4c645e62493e.vmdk" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1463.290982] env[68906]: ERROR nova.compute.manager [None req-8e5cf191-d2ff-420a-92ea-3d8a1505aeef tempest-ServersTestManualDisk-672681136 tempest-ServersTestManualDisk-672681136-project-member] [instance: 641cca5b-d749-4331-a5e0-8acb6d47cba2] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1463.290982] env[68906]: Faults: ['InvalidArgument'] [ 1463.290982] env[68906]: ERROR nova.compute.manager [instance: 641cca5b-d749-4331-a5e0-8acb6d47cba2] Traceback (most recent call last): [ 1463.290982] env[68906]: ERROR nova.compute.manager [instance: 641cca5b-d749-4331-a5e0-8acb6d47cba2] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1463.290982] env[68906]: ERROR nova.compute.manager [instance: 641cca5b-d749-4331-a5e0-8acb6d47cba2] yield resources [ 1463.290982] env[68906]: ERROR nova.compute.manager [instance: 641cca5b-d749-4331-a5e0-8acb6d47cba2] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1463.290982] env[68906]: ERROR nova.compute.manager [instance: 641cca5b-d749-4331-a5e0-8acb6d47cba2] self.driver.spawn(context, instance, image_meta, [ 1463.290982] env[68906]: ERROR nova.compute.manager [instance: 641cca5b-d749-4331-a5e0-8acb6d47cba2] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1463.290982] env[68906]: ERROR nova.compute.manager [instance: 641cca5b-d749-4331-a5e0-8acb6d47cba2] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1463.290982] env[68906]: ERROR nova.compute.manager [instance: 641cca5b-d749-4331-a5e0-8acb6d47cba2] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1463.290982] env[68906]: ERROR nova.compute.manager [instance: 641cca5b-d749-4331-a5e0-8acb6d47cba2] self._fetch_image_if_missing(context, vi) [ 1463.290982] env[68906]: ERROR nova.compute.manager [instance: 641cca5b-d749-4331-a5e0-8acb6d47cba2] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1463.291331] env[68906]: ERROR nova.compute.manager [instance: 641cca5b-d749-4331-a5e0-8acb6d47cba2] image_cache(vi, tmp_image_ds_loc) [ 1463.291331] env[68906]: ERROR nova.compute.manager [instance: 641cca5b-d749-4331-a5e0-8acb6d47cba2] 
File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1463.291331] env[68906]: ERROR nova.compute.manager [instance: 641cca5b-d749-4331-a5e0-8acb6d47cba2] vm_util.copy_virtual_disk( [ 1463.291331] env[68906]: ERROR nova.compute.manager [instance: 641cca5b-d749-4331-a5e0-8acb6d47cba2] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1463.291331] env[68906]: ERROR nova.compute.manager [instance: 641cca5b-d749-4331-a5e0-8acb6d47cba2] session._wait_for_task(vmdk_copy_task) [ 1463.291331] env[68906]: ERROR nova.compute.manager [instance: 641cca5b-d749-4331-a5e0-8acb6d47cba2] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1463.291331] env[68906]: ERROR nova.compute.manager [instance: 641cca5b-d749-4331-a5e0-8acb6d47cba2] return self.wait_for_task(task_ref) [ 1463.291331] env[68906]: ERROR nova.compute.manager [instance: 641cca5b-d749-4331-a5e0-8acb6d47cba2] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1463.291331] env[68906]: ERROR nova.compute.manager [instance: 641cca5b-d749-4331-a5e0-8acb6d47cba2] return evt.wait() [ 1463.291331] env[68906]: ERROR nova.compute.manager [instance: 641cca5b-d749-4331-a5e0-8acb6d47cba2] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1463.291331] env[68906]: ERROR nova.compute.manager [instance: 641cca5b-d749-4331-a5e0-8acb6d47cba2] result = hub.switch() [ 1463.291331] env[68906]: ERROR nova.compute.manager [instance: 641cca5b-d749-4331-a5e0-8acb6d47cba2] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1463.291331] env[68906]: ERROR nova.compute.manager [instance: 641cca5b-d749-4331-a5e0-8acb6d47cba2] return self.greenlet.switch() [ 1463.292049] env[68906]: ERROR nova.compute.manager [instance: 641cca5b-d749-4331-a5e0-8acb6d47cba2] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1463.292049] env[68906]: ERROR nova.compute.manager [instance: 641cca5b-d749-4331-a5e0-8acb6d47cba2] self.f(*self.args, **self.kw) [ 1463.292049] env[68906]: ERROR nova.compute.manager [instance: 641cca5b-d749-4331-a5e0-8acb6d47cba2] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1463.292049] env[68906]: ERROR nova.compute.manager [instance: 641cca5b-d749-4331-a5e0-8acb6d47cba2] raise exceptions.translate_fault(task_info.error) [ 1463.292049] env[68906]: ERROR nova.compute.manager [instance: 641cca5b-d749-4331-a5e0-8acb6d47cba2] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1463.292049] env[68906]: ERROR nova.compute.manager [instance: 641cca5b-d749-4331-a5e0-8acb6d47cba2] Faults: ['InvalidArgument'] [ 1463.292049] env[68906]: ERROR nova.compute.manager [instance: 641cca5b-d749-4331-a5e0-8acb6d47cba2] [ 1463.292049] env[68906]: INFO nova.compute.manager [None req-8e5cf191-d2ff-420a-92ea-3d8a1505aeef tempest-ServersTestManualDisk-672681136 tempest-ServersTestManualDisk-672681136-project-member] [instance: 641cca5b-d749-4331-a5e0-8acb6d47cba2] Terminating instance [ 1463.293761] env[68906]: DEBUG nova.compute.manager [None req-8e5cf191-d2ff-420a-92ea-3d8a1505aeef tempest-ServersTestManualDisk-672681136 tempest-ServersTestManualDisk-672681136-project-member] [instance: 641cca5b-d749-4331-a5e0-8acb6d47cba2] Start destroying the instance on the hypervisor. 
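"Fault InvalidArgument not matched", logged just before the exception surfaces, is oslo.vmware failing to find a specific exception class registered for the fault name and falling back to the generic VimFaultException that the traceback above then shows. A registry-with-fallback sketch of that dispatch; the class names here are stand-ins rather than the oslo.vmware originals:

```python
# Fault-name -> exception-class dispatch with a generic fallback.
class VimFaultException(Exception):
    def __init__(self, fault_list, msg):
        super().__init__(msg)
        self.fault_list = fault_list

class FileNotFoundFault(VimFaultException):
    pass

_FAULT_CLASSES = {"FileNotFound": FileNotFoundFault}   # InvalidArgument absent

def translate_fault(fault_name, msg):
    cls = _FAULT_CLASSES.get(fault_name)
    if cls is None:
        print(f"Fault {fault_name} not matched.")      # the DEBUG line above
        return VimFaultException([fault_name], msg)
    return cls([fault_name], msg)

err = translate_fault("InvalidArgument",
                      "A specified parameter was not correct: fileType")
assert isinstance(err, VimFaultException)
```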
{{(pid=68906) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1463.293985] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-8e5cf191-d2ff-420a-92ea-3d8a1505aeef tempest-ServersTestManualDisk-672681136 tempest-ServersTestManualDisk-672681136-project-member] [instance: 641cca5b-d749-4331-a5e0-8acb6d47cba2] Destroying instance {{(pid=68906) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1463.294298] env[68906]: DEBUG oslo_concurrency.lockutils [None req-0a3299be-e4a9-4311-8e53-c592324fe331 tempest-AttachInterfacesUnderV243Test-365796744 tempest-AttachInterfacesUnderV243Test-365796744-project-member] Acquired lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e/b1400c31-d33b-4e13-944f-4c645e62493e.vmdk" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1463.294494] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-0a3299be-e4a9-4311-8e53-c592324fe331 tempest-AttachInterfacesUnderV243Test-365796744 tempest-AttachInterfacesUnderV243Test-365796744-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=68906) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1463.295268] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-baa74f6d-bda7-4a13-b812-82dfce45296d {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1463.298319] env[68906]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-9fb1b46d-2658-4a16-ac93-7890d3bd1c5f {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1463.304557] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-8e5cf191-d2ff-420a-92ea-3d8a1505aeef tempest-ServersTestManualDisk-672681136 tempest-ServersTestManualDisk-672681136-project-member] [instance: 641cca5b-d749-4331-a5e0-8acb6d47cba2] Unregistering the VM {{(pid=68906) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1463.304810] env[68906]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-3cafa5ca-1bc9-47a1-ac2c-31984b265eed {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1463.307147] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-0a3299be-e4a9-4311-8e53-c592324fe331 tempest-AttachInterfacesUnderV243Test-365796744 tempest-AttachInterfacesUnderV243Test-365796744-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=68906) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1463.307326] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-0a3299be-e4a9-4311-8e53-c592324fe331 tempest-AttachInterfacesUnderV243Test-365796744 tempest-AttachInterfacesUnderV243Test-365796744-project-member] Folder [datastore2] devstack-image-cache_base created. 
{{(pid=68906) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1463.308322] env[68906]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-98f3d61c-b165-4133-8fa2-e606d2e2f534 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1463.314656] env[68906]: DEBUG oslo_vmware.api [None req-0a3299be-e4a9-4311-8e53-c592324fe331 tempest-AttachInterfacesUnderV243Test-365796744 tempest-AttachInterfacesUnderV243Test-365796744-project-member] Waiting for the task: (returnval){ [ 1463.314656] env[68906]: value = "session[52a3cb0d-1212-490c-2669-91043b4da4d8]52a5ef3c-467a-86d8-1114-6dd704d074fe" [ 1463.314656] env[68906]: _type = "Task" [ 1463.314656] env[68906]: } to complete. {{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1463.321828] env[68906]: DEBUG oslo_vmware.api [None req-0a3299be-e4a9-4311-8e53-c592324fe331 tempest-AttachInterfacesUnderV243Test-365796744 tempest-AttachInterfacesUnderV243Test-365796744-project-member] Task: {'id': session[52a3cb0d-1212-490c-2669-91043b4da4d8]52a5ef3c-467a-86d8-1114-6dd704d074fe, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1463.377541] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-8e5cf191-d2ff-420a-92ea-3d8a1505aeef tempest-ServersTestManualDisk-672681136 tempest-ServersTestManualDisk-672681136-project-member] [instance: 641cca5b-d749-4331-a5e0-8acb6d47cba2] Unregistered the VM {{(pid=68906) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1463.377789] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-8e5cf191-d2ff-420a-92ea-3d8a1505aeef tempest-ServersTestManualDisk-672681136 tempest-ServersTestManualDisk-672681136-project-member] [instance: 641cca5b-d749-4331-a5e0-8acb6d47cba2] Deleting contents of the VM from datastore datastore2 {{(pid=68906) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1463.377973] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-8e5cf191-d2ff-420a-92ea-3d8a1505aeef tempest-ServersTestManualDisk-672681136 tempest-ServersTestManualDisk-672681136-project-member] Deleting the datastore file [datastore2] 641cca5b-d749-4331-a5e0-8acb6d47cba2 {{(pid=68906) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1463.378256] env[68906]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-c4352177-4505-4747-baf0-906a46f017b6 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1463.386233] env[68906]: DEBUG oslo_vmware.api [None req-8e5cf191-d2ff-420a-92ea-3d8a1505aeef tempest-ServersTestManualDisk-672681136 tempest-ServersTestManualDisk-672681136-project-member] Waiting for the task: (returnval){ [ 1463.386233] env[68906]: value = "task-3475398" [ 1463.386233] env[68906]: _type = "Task" [ 1463.386233] env[68906]: } to complete. {{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1463.394162] env[68906]: DEBUG oslo_vmware.api [None req-8e5cf191-d2ff-420a-92ea-3d8a1505aeef tempest-ServersTestManualDisk-672681136 tempest-ServersTestManualDisk-672681136-project-member] Task: {'id': task-3475398, 'name': DeleteDatastoreFile_Task} progress is 0%. 
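The mkdir plus "Folder [datastore2] devstack-image-cache_base created" sequence above is a create-if-missing step: the directory is created unconditionally and an already-exists outcome is treated as success, so concurrent spawns can race safely. A local-filesystem analogue of that idiom:

```python
# Idempotent create-if-missing, the shape of _create_folder_if_missing above.
import os

def create_folder_if_missing(path):
    try:
        os.makedirs(path)
        print(f"Folder {path} created.")
    except FileExistsError:
        # Someone else (another spawn) won the race -- that is fine.
        print(f"Folder {path} already exists.")

create_folder_if_missing("/tmp/devstack-image-cache_base")
create_folder_if_missing("/tmp/devstack-image-cache_base")  # idempotent
```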
{{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1463.825561] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-0a3299be-e4a9-4311-8e53-c592324fe331 tempest-AttachInterfacesUnderV243Test-365796744 tempest-AttachInterfacesUnderV243Test-365796744-project-member] [instance: 9bd17d9a-2f34-4e99-91b1-a7c3fbc1f29a] Preparing fetch location {{(pid=68906) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1463.825852] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-0a3299be-e4a9-4311-8e53-c592324fe331 tempest-AttachInterfacesUnderV243Test-365796744 tempest-AttachInterfacesUnderV243Test-365796744-project-member] Creating directory with path [datastore2] vmware_temp/ea0d0c3b-80ed-4819-8393-c05c4a12471a/b1400c31-d33b-4e13-944f-4c645e62493e {{(pid=68906) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1463.826072] env[68906]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-46a0a01a-838e-4144-bd77-3c69a72bc123 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1463.836719] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-0a3299be-e4a9-4311-8e53-c592324fe331 tempest-AttachInterfacesUnderV243Test-365796744 tempest-AttachInterfacesUnderV243Test-365796744-project-member] Created directory with path [datastore2] vmware_temp/ea0d0c3b-80ed-4819-8393-c05c4a12471a/b1400c31-d33b-4e13-944f-4c645e62493e {{(pid=68906) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1463.836910] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-0a3299be-e4a9-4311-8e53-c592324fe331 tempest-AttachInterfacesUnderV243Test-365796744 tempest-AttachInterfacesUnderV243Test-365796744-project-member] [instance: 9bd17d9a-2f34-4e99-91b1-a7c3fbc1f29a] Fetch image to [datastore2] vmware_temp/ea0d0c3b-80ed-4819-8393-c05c4a12471a/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk {{(pid=68906) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1463.837087] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-0a3299be-e4a9-4311-8e53-c592324fe331 tempest-AttachInterfacesUnderV243Test-365796744 tempest-AttachInterfacesUnderV243Test-365796744-project-member] [instance: 9bd17d9a-2f34-4e99-91b1-a7c3fbc1f29a] Downloading image file data b1400c31-d33b-4e13-944f-4c645e62493e to [datastore2] vmware_temp/ea0d0c3b-80ed-4819-8393-c05c4a12471a/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk on the data store datastore2 {{(pid=68906) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1463.837848] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-acf5f86e-b808-48dc-a5ae-67f0b9c4de07 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1463.844637] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-84b997ad-05b8-42ae-b700-a74a2cd6dd3b {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1463.853492] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1c44bbbb-1879-4c3c-844d-6848cb2369de {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1463.883115] env[68906]: DEBUG oslo_vmware.service [-] Invoking 
PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1c8d53f2-aeba-4402-af53-eba033edde72 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1463.891087] env[68906]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-b18a1020-24aa-4aee-8612-80e3c21c294a {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1463.895318] env[68906]: DEBUG oslo_vmware.api [None req-8e5cf191-d2ff-420a-92ea-3d8a1505aeef tempest-ServersTestManualDisk-672681136 tempest-ServersTestManualDisk-672681136-project-member] Task: {'id': task-3475398, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.070573} completed successfully. {{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1463.895714] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-8e5cf191-d2ff-420a-92ea-3d8a1505aeef tempest-ServersTestManualDisk-672681136 tempest-ServersTestManualDisk-672681136-project-member] Deleted the datastore file {{(pid=68906) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1463.895896] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-8e5cf191-d2ff-420a-92ea-3d8a1505aeef tempest-ServersTestManualDisk-672681136 tempest-ServersTestManualDisk-672681136-project-member] [instance: 641cca5b-d749-4331-a5e0-8acb6d47cba2] Deleted contents of the VM from datastore datastore2 {{(pid=68906) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1463.896084] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-8e5cf191-d2ff-420a-92ea-3d8a1505aeef tempest-ServersTestManualDisk-672681136 tempest-ServersTestManualDisk-672681136-project-member] [instance: 641cca5b-d749-4331-a5e0-8acb6d47cba2] Instance destroyed {{(pid=68906) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1463.896287] env[68906]: INFO nova.compute.manager [None req-8e5cf191-d2ff-420a-92ea-3d8a1505aeef tempest-ServersTestManualDisk-672681136 tempest-ServersTestManualDisk-672681136-project-member] [instance: 641cca5b-d749-4331-a5e0-8acb6d47cba2] Took 0.60 seconds to destroy the instance on the hypervisor. 
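With the instance gone from the hypervisor, the resource claim taken for it at build time still has to be returned, which is what the "Aborting claim" / abort_instance_claim lines just below do under the tracker's "compute_resources" lock. A minimal context-manager sketch of that claim/rollback pattern (all names and numbers illustrative):

```python
# Claim resources, roll them back on failure -- the abort path shown below.
import threading
from contextlib import contextmanager

_resources_lock = threading.Lock()
free = {"MEMORY_MB": 196078, "VCPU": 48, "DISK_GB": 400}

@contextmanager
def claim(request):
    with _resources_lock:                   # "Acquiring lock compute_resources"
        for k, v in request.items():
            free[k] -= v
    try:
        yield
    except Exception:
        with _resources_lock:               # "... by abort_instance_claim"
            for k, v in request.items():
                free[k] += v                # give everything back
        raise

try:
    with claim({"MEMORY_MB": 128, "VCPU": 1, "DISK_GB": 1}):
        raise RuntimeError("spawn failed")  # stands in for the VimFault
except RuntimeError:
    pass
assert free["MEMORY_MB"] == 196078          # claim fully rolled back
```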
[ 1463.898668] env[68906]: DEBUG nova.compute.claims [None req-8e5cf191-d2ff-420a-92ea-3d8a1505aeef tempest-ServersTestManualDisk-672681136 tempest-ServersTestManualDisk-672681136-project-member] [instance: 641cca5b-d749-4331-a5e0-8acb6d47cba2] Aborting claim: {{(pid=68906) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1463.898835] env[68906]: DEBUG oslo_concurrency.lockutils [None req-8e5cf191-d2ff-420a-92ea-3d8a1505aeef tempest-ServersTestManualDisk-672681136 tempest-ServersTestManualDisk-672681136-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1463.899061] env[68906]: DEBUG oslo_concurrency.lockutils [None req-8e5cf191-d2ff-420a-92ea-3d8a1505aeef tempest-ServersTestManualDisk-672681136 tempest-ServersTestManualDisk-672681136-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1463.912840] env[68906]: DEBUG nova.virt.vmwareapi.images [None req-0a3299be-e4a9-4311-8e53-c592324fe331 tempest-AttachInterfacesUnderV243Test-365796744 tempest-AttachInterfacesUnderV243Test-365796744-project-member] [instance: 9bd17d9a-2f34-4e99-91b1-a7c3fbc1f29a] Downloading image file data b1400c31-d33b-4e13-944f-4c645e62493e to the data store datastore2 {{(pid=68906) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1463.964274] env[68906]: DEBUG oslo_vmware.rw_handles [None req-0a3299be-e4a9-4311-8e53-c592324fe331 tempest-AttachInterfacesUnderV243Test-365796744 tempest-AttachInterfacesUnderV243Test-365796744-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/ea0d0c3b-80ed-4819-8393-c05c4a12471a/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=68906) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1464.022788] env[68906]: DEBUG oslo_vmware.rw_handles [None req-0a3299be-e4a9-4311-8e53-c592324fe331 tempest-AttachInterfacesUnderV243Test-365796744 tempest-AttachInterfacesUnderV243Test-365796744-project-member] Completed reading data from the image iterator. {{(pid=68906) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1464.022988] env[68906]: DEBUG oslo_vmware.rw_handles [None req-0a3299be-e4a9-4311-8e53-c592324fe331 tempest-AttachInterfacesUnderV243Test-365796744 tempest-AttachInterfacesUnderV243Test-365796744-project-member] Closing write handle for https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/ea0d0c3b-80ed-4819-8393-c05c4a12471a/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=68906) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1464.213231] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-39176833-cfdd-4e72-af3a-8cfdc41308c5 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1464.220979] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-39260331-c121-4ec7-9ced-2d0a3baf198a {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1464.250960] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8d8f03ac-f672-4018-a103-e26acc9e9209 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1464.258345] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-337e2a40-019e-4f7c-bcf7-01cddcb2718c {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1464.271472] env[68906]: DEBUG nova.compute.provider_tree [None req-8e5cf191-d2ff-420a-92ea-3d8a1505aeef tempest-ServersTestManualDisk-672681136 tempest-ServersTestManualDisk-672681136-project-member] Inventory has not changed in ProviderTree for provider: 1119f6db-bfd7-4ef3-bdff-5c6974dc249b {{(pid=68906) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1464.280727] env[68906]: DEBUG nova.scheduler.client.report [None req-8e5cf191-d2ff-420a-92ea-3d8a1505aeef tempest-ServersTestManualDisk-672681136 tempest-ServersTestManualDisk-672681136-project-member] Inventory has not changed for provider 1119f6db-bfd7-4ef3-bdff-5c6974dc249b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 93, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68906) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1464.297184] env[68906]: DEBUG oslo_concurrency.lockutils [None req-8e5cf191-d2ff-420a-92ea-3d8a1505aeef tempest-ServersTestManualDisk-672681136 tempest-ServersTestManualDisk-672681136-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.398s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1464.297729] env[68906]: ERROR nova.compute.manager [None req-8e5cf191-d2ff-420a-92ea-3d8a1505aeef tempest-ServersTestManualDisk-672681136 tempest-ServersTestManualDisk-672681136-project-member] [instance: 641cca5b-d749-4331-a5e0-8acb6d47cba2] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1464.297729] env[68906]: Faults: ['InvalidArgument'] [ 1464.297729] env[68906]: ERROR nova.compute.manager [instance: 641cca5b-d749-4331-a5e0-8acb6d47cba2] Traceback (most recent call last): [ 1464.297729] env[68906]: ERROR nova.compute.manager [instance: 641cca5b-d749-4331-a5e0-8acb6d47cba2] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1464.297729] env[68906]: ERROR 
nova.compute.manager [instance: 641cca5b-d749-4331-a5e0-8acb6d47cba2] self.driver.spawn(context, instance, image_meta, [ 1464.297729] env[68906]: ERROR nova.compute.manager [instance: 641cca5b-d749-4331-a5e0-8acb6d47cba2] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1464.297729] env[68906]: ERROR nova.compute.manager [instance: 641cca5b-d749-4331-a5e0-8acb6d47cba2] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1464.297729] env[68906]: ERROR nova.compute.manager [instance: 641cca5b-d749-4331-a5e0-8acb6d47cba2] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1464.297729] env[68906]: ERROR nova.compute.manager [instance: 641cca5b-d749-4331-a5e0-8acb6d47cba2] self._fetch_image_if_missing(context, vi) [ 1464.297729] env[68906]: ERROR nova.compute.manager [instance: 641cca5b-d749-4331-a5e0-8acb6d47cba2] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1464.297729] env[68906]: ERROR nova.compute.manager [instance: 641cca5b-d749-4331-a5e0-8acb6d47cba2] image_cache(vi, tmp_image_ds_loc) [ 1464.297729] env[68906]: ERROR nova.compute.manager [instance: 641cca5b-d749-4331-a5e0-8acb6d47cba2] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1464.298048] env[68906]: ERROR nova.compute.manager [instance: 641cca5b-d749-4331-a5e0-8acb6d47cba2] vm_util.copy_virtual_disk( [ 1464.298048] env[68906]: ERROR nova.compute.manager [instance: 641cca5b-d749-4331-a5e0-8acb6d47cba2] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1464.298048] env[68906]: ERROR nova.compute.manager [instance: 641cca5b-d749-4331-a5e0-8acb6d47cba2] session._wait_for_task(vmdk_copy_task) [ 1464.298048] env[68906]: ERROR nova.compute.manager [instance: 641cca5b-d749-4331-a5e0-8acb6d47cba2] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1464.298048] env[68906]: ERROR nova.compute.manager [instance: 641cca5b-d749-4331-a5e0-8acb6d47cba2] return self.wait_for_task(task_ref) [ 1464.298048] env[68906]: ERROR nova.compute.manager [instance: 641cca5b-d749-4331-a5e0-8acb6d47cba2] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1464.298048] env[68906]: ERROR nova.compute.manager [instance: 641cca5b-d749-4331-a5e0-8acb6d47cba2] return evt.wait() [ 1464.298048] env[68906]: ERROR nova.compute.manager [instance: 641cca5b-d749-4331-a5e0-8acb6d47cba2] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1464.298048] env[68906]: ERROR nova.compute.manager [instance: 641cca5b-d749-4331-a5e0-8acb6d47cba2] result = hub.switch() [ 1464.298048] env[68906]: ERROR nova.compute.manager [instance: 641cca5b-d749-4331-a5e0-8acb6d47cba2] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1464.298048] env[68906]: ERROR nova.compute.manager [instance: 641cca5b-d749-4331-a5e0-8acb6d47cba2] return self.greenlet.switch() [ 1464.298048] env[68906]: ERROR nova.compute.manager [instance: 641cca5b-d749-4331-a5e0-8acb6d47cba2] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1464.298048] env[68906]: ERROR nova.compute.manager [instance: 641cca5b-d749-4331-a5e0-8acb6d47cba2] self.f(*self.args, **self.kw) [ 1464.298383] env[68906]: ERROR nova.compute.manager [instance: 641cca5b-d749-4331-a5e0-8acb6d47cba2] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1464.298383] env[68906]: ERROR nova.compute.manager [instance: 641cca5b-d749-4331-a5e0-8acb6d47cba2] raise exceptions.translate_fault(task_info.error) [ 1464.298383] env[68906]: ERROR nova.compute.manager [instance: 641cca5b-d749-4331-a5e0-8acb6d47cba2] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1464.298383] env[68906]: ERROR nova.compute.manager [instance: 641cca5b-d749-4331-a5e0-8acb6d47cba2] Faults: ['InvalidArgument'] [ 1464.298383] env[68906]: ERROR nova.compute.manager [instance: 641cca5b-d749-4331-a5e0-8acb6d47cba2] [ 1464.298526] env[68906]: DEBUG nova.compute.utils [None req-8e5cf191-d2ff-420a-92ea-3d8a1505aeef tempest-ServersTestManualDisk-672681136 tempest-ServersTestManualDisk-672681136-project-member] [instance: 641cca5b-d749-4331-a5e0-8acb6d47cba2] VimFaultException {{(pid=68906) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1464.300466] env[68906]: DEBUG nova.compute.manager [None req-8e5cf191-d2ff-420a-92ea-3d8a1505aeef tempest-ServersTestManualDisk-672681136 tempest-ServersTestManualDisk-672681136-project-member] [instance: 641cca5b-d749-4331-a5e0-8acb6d47cba2] Build of instance 641cca5b-d749-4331-a5e0-8acb6d47cba2 was re-scheduled: A specified parameter was not correct: fileType [ 1464.300466] env[68906]: Faults: ['InvalidArgument'] {{(pid=68906) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1464.300848] env[68906]: DEBUG nova.compute.manager [None req-8e5cf191-d2ff-420a-92ea-3d8a1505aeef tempest-ServersTestManualDisk-672681136 tempest-ServersTestManualDisk-672681136-project-member] [instance: 641cca5b-d749-4331-a5e0-8acb6d47cba2] Unplugging VIFs for instance {{(pid=68906) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1464.301030] env[68906]: DEBUG nova.compute.manager [None req-8e5cf191-d2ff-420a-92ea-3d8a1505aeef tempest-ServersTestManualDisk-672681136 tempest-ServersTestManualDisk-672681136-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=68906) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1464.301206] env[68906]: DEBUG nova.compute.manager [None req-8e5cf191-d2ff-420a-92ea-3d8a1505aeef tempest-ServersTestManualDisk-672681136 tempest-ServersTestManualDisk-672681136-project-member] [instance: 641cca5b-d749-4331-a5e0-8acb6d47cba2] Deallocating network for instance {{(pid=68906) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1464.301371] env[68906]: DEBUG nova.network.neutron [None req-8e5cf191-d2ff-420a-92ea-3d8a1505aeef tempest-ServersTestManualDisk-672681136 tempest-ServersTestManualDisk-672681136-project-member] [instance: 641cca5b-d749-4331-a5e0-8acb6d47cba2] deallocate_for_instance() {{(pid=68906) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1464.745052] env[68906]: DEBUG nova.network.neutron [None req-8e5cf191-d2ff-420a-92ea-3d8a1505aeef tempest-ServersTestManualDisk-672681136 tempest-ServersTestManualDisk-672681136-project-member] [instance: 641cca5b-d749-4331-a5e0-8acb6d47cba2] Updating instance_info_cache with network_info: [] {{(pid=68906) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1464.756376] env[68906]: INFO nova.compute.manager [None req-8e5cf191-d2ff-420a-92ea-3d8a1505aeef tempest-ServersTestManualDisk-672681136 tempest-ServersTestManualDisk-672681136-project-member] [instance: 641cca5b-d749-4331-a5e0-8acb6d47cba2] Took 0.45 seconds to deallocate network for instance. [ 1464.873089] env[68906]: INFO nova.scheduler.client.report [None req-8e5cf191-d2ff-420a-92ea-3d8a1505aeef tempest-ServersTestManualDisk-672681136 tempest-ServersTestManualDisk-672681136-project-member] Deleted allocations for instance 641cca5b-d749-4331-a5e0-8acb6d47cba2 [ 1464.902612] env[68906]: DEBUG oslo_concurrency.lockutils [None req-8e5cf191-d2ff-420a-92ea-3d8a1505aeef tempest-ServersTestManualDisk-672681136 tempest-ServersTestManualDisk-672681136-project-member] Lock "641cca5b-d749-4331-a5e0-8acb6d47cba2" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 627.682s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1464.902612] env[68906]: DEBUG oslo_concurrency.lockutils [None req-a836aaa4-8a06-4a84-954e-ba88feaacba8 tempest-ServersTestManualDisk-672681136 tempest-ServersTestManualDisk-672681136-project-member] Lock "641cca5b-d749-4331-a5e0-8acb6d47cba2" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 431.487s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1464.902612] env[68906]: DEBUG oslo_concurrency.lockutils [None req-a836aaa4-8a06-4a84-954e-ba88feaacba8 tempest-ServersTestManualDisk-672681136 tempest-ServersTestManualDisk-672681136-project-member] Acquiring lock "641cca5b-d749-4331-a5e0-8acb6d47cba2-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1464.902793] env[68906]: DEBUG oslo_concurrency.lockutils [None req-a836aaa4-8a06-4a84-954e-ba88feaacba8 tempest-ServersTestManualDisk-672681136 tempest-ServersTestManualDisk-672681136-project-member] Lock "641cca5b-d749-4331-a5e0-8acb6d47cba2-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=68906) inner
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1464.902793] env[68906]: DEBUG oslo_concurrency.lockutils [None req-a836aaa4-8a06-4a84-954e-ba88feaacba8 tempest-ServersTestManualDisk-672681136 tempest-ServersTestManualDisk-672681136-project-member] Lock "641cca5b-d749-4331-a5e0-8acb6d47cba2-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1464.908560] env[68906]: INFO nova.compute.manager [None req-a836aaa4-8a06-4a84-954e-ba88feaacba8 tempest-ServersTestManualDisk-672681136 tempest-ServersTestManualDisk-672681136-project-member] [instance: 641cca5b-d749-4331-a5e0-8acb6d47cba2] Terminating instance [ 1464.913567] env[68906]: DEBUG nova.compute.manager [None req-a836aaa4-8a06-4a84-954e-ba88feaacba8 tempest-ServersTestManualDisk-672681136 tempest-ServersTestManualDisk-672681136-project-member] [instance: 641cca5b-d749-4331-a5e0-8acb6d47cba2] Start destroying the instance on the hypervisor. {{(pid=68906) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1464.913772] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-a836aaa4-8a06-4a84-954e-ba88feaacba8 tempest-ServersTestManualDisk-672681136 tempest-ServersTestManualDisk-672681136-project-member] [instance: 641cca5b-d749-4331-a5e0-8acb6d47cba2] Destroying instance {{(pid=68906) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1464.915895] env[68906]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-103305ed-8923-4b59-a4d4-7a5c4c135b46 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1464.924146] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7dbc41c8-887b-4286-94ed-b40100f36e76 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1464.937926] env[68906]: DEBUG nova.compute.manager [None req-82d3bc4e-32c1-4af2-8d04-b9efe25d88fc tempest-ListServersNegativeTestJSON-844965327 tempest-ListServersNegativeTestJSON-844965327-project-member] [instance: 917ba3c3-9188-40fa-be6c-cdab27b76970] Starting instance... {{(pid=68906) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1464.957148] env[68906]: WARNING nova.virt.vmwareapi.vmops [None req-a836aaa4-8a06-4a84-954e-ba88feaacba8 tempest-ServersTestManualDisk-672681136 tempest-ServersTestManualDisk-672681136-project-member] [instance: 641cca5b-d749-4331-a5e0-8acb6d47cba2] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 641cca5b-d749-4331-a5e0-8acb6d47cba2 could not be found. [ 1464.957148] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-a836aaa4-8a06-4a84-954e-ba88feaacba8 tempest-ServersTestManualDisk-672681136 tempest-ServersTestManualDisk-672681136-project-member] [instance: 641cca5b-d749-4331-a5e0-8acb6d47cba2] Instance destroyed {{(pid=68906) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1464.957148] env[68906]: INFO nova.compute.manager [None req-a836aaa4-8a06-4a84-954e-ba88feaacba8 tempest-ServersTestManualDisk-672681136 tempest-ServersTestManualDisk-672681136-project-member] [instance: 641cca5b-d749-4331-a5e0-8acb6d47cba2] Took 0.04 seconds to destroy the instance on the hypervisor.
[ 1464.957148] env[68906]: DEBUG oslo.service.loopingcall [None req-a836aaa4-8a06-4a84-954e-ba88feaacba8 tempest-ServersTestManualDisk-672681136 tempest-ServersTestManualDisk-672681136-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=68906) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1464.957148] env[68906]: DEBUG nova.compute.manager [-] [instance: 641cca5b-d749-4331-a5e0-8acb6d47cba2] Deallocating network for instance {{(pid=68906) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1464.957553] env[68906]: DEBUG nova.network.neutron [-] [instance: 641cca5b-d749-4331-a5e0-8acb6d47cba2] deallocate_for_instance() {{(pid=68906) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1464.965639] env[68906]: DEBUG nova.compute.manager [None req-82d3bc4e-32c1-4af2-8d04-b9efe25d88fc tempest-ListServersNegativeTestJSON-844965327 tempest-ListServersNegativeTestJSON-844965327-project-member] [instance: 917ba3c3-9188-40fa-be6c-cdab27b76970] Instance disappeared before build. {{(pid=68906) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1464.982132] env[68906]: DEBUG nova.network.neutron [-] [instance: 641cca5b-d749-4331-a5e0-8acb6d47cba2] Updating instance_info_cache with network_info: [] {{(pid=68906) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1464.989715] env[68906]: DEBUG oslo_concurrency.lockutils [None req-82d3bc4e-32c1-4af2-8d04-b9efe25d88fc tempest-ListServersNegativeTestJSON-844965327 tempest-ListServersNegativeTestJSON-844965327-project-member] Lock "917ba3c3-9188-40fa-be6c-cdab27b76970" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 229.789s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1464.991133] env[68906]: INFO nova.compute.manager [-] [instance: 641cca5b-d749-4331-a5e0-8acb6d47cba2] Took 0.03 seconds to deallocate network for instance. [ 1465.005282] env[68906]: DEBUG nova.compute.manager [None req-82d3bc4e-32c1-4af2-8d04-b9efe25d88fc tempest-ListServersNegativeTestJSON-844965327 tempest-ListServersNegativeTestJSON-844965327-project-member] [instance: 7803f951-a0c0-4246-b2d9-3eabadfa679d] Starting instance... {{(pid=68906) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1465.033526] env[68906]: DEBUG nova.compute.manager [None req-82d3bc4e-32c1-4af2-8d04-b9efe25d88fc tempest-ListServersNegativeTestJSON-844965327 tempest-ListServersNegativeTestJSON-844965327-project-member] [instance: 7803f951-a0c0-4246-b2d9-3eabadfa679d] Instance disappeared before build.
{{(pid=68906) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1465.055092] env[68906]: DEBUG oslo_concurrency.lockutils [None req-82d3bc4e-32c1-4af2-8d04-b9efe25d88fc tempest-ListServersNegativeTestJSON-844965327 tempest-ListServersNegativeTestJSON-844965327-project-member] Lock "7803f951-a0c0-4246-b2d9-3eabadfa679d" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 229.825s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1465.064873] env[68906]: DEBUG nova.compute.manager [None req-c44660d0-cc4e-4ff6-b5ef-48f4a756fdda tempest-ServersAdminNegativeTestJSON-1434965427 tempest-ServersAdminNegativeTestJSON-1434965427-project-member] [instance: 8a4e18b6-55c0-4397-b570-27db4541e9b3] Starting instance... {{(pid=68906) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1465.098040] env[68906]: DEBUG oslo_concurrency.lockutils [None req-a836aaa4-8a06-4a84-954e-ba88feaacba8 tempest-ServersTestManualDisk-672681136 tempest-ServersTestManualDisk-672681136-project-member] Lock "641cca5b-d749-4331-a5e0-8acb6d47cba2" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.196s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1465.098945] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Lock "641cca5b-d749-4331-a5e0-8acb6d47cba2" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 254.146s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1465.099255] env[68906]: INFO nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: 641cca5b-d749-4331-a5e0-8acb6d47cba2] During sync_power_state the instance has a pending task (deleting). Skip. [ 1465.103075] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Lock "641cca5b-d749-4331-a5e0-8acb6d47cba2" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1465.103075] env[68906]: DEBUG nova.compute.manager [None req-c44660d0-cc4e-4ff6-b5ef-48f4a756fdda tempest-ServersAdminNegativeTestJSON-1434965427 tempest-ServersAdminNegativeTestJSON-1434965427-project-member] [instance: 8a4e18b6-55c0-4397-b570-27db4541e9b3] Instance disappeared before build.
{{(pid=68906) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1465.126319] env[68906]: DEBUG oslo_concurrency.lockutils [None req-c44660d0-cc4e-4ff6-b5ef-48f4a756fdda tempest-ServersAdminNegativeTestJSON-1434965427 tempest-ServersAdminNegativeTestJSON-1434965427-project-member] Lock "8a4e18b6-55c0-4397-b570-27db4541e9b3" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 214.118s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1465.145637] env[68906]: DEBUG nova.compute.manager [None req-c9c1fd81-2ddc-492d-b16c-1ea8a5a25b7e tempest-SecurityGroupsTestJSON-973572118 tempest-SecurityGroupsTestJSON-973572118-project-member] [instance: 2b688987-d4cf-4ebb-83c4-d5fa7f5bcbb9] Starting instance... {{(pid=68906) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1465.169398] env[68906]: DEBUG nova.compute.manager [None req-c9c1fd81-2ddc-492d-b16c-1ea8a5a25b7e tempest-SecurityGroupsTestJSON-973572118 tempest-SecurityGroupsTestJSON-973572118-project-member] [instance: 2b688987-d4cf-4ebb-83c4-d5fa7f5bcbb9] Instance disappeared before build. {{(pid=68906) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1465.189641] env[68906]: DEBUG oslo_concurrency.lockutils [None req-c9c1fd81-2ddc-492d-b16c-1ea8a5a25b7e tempest-SecurityGroupsTestJSON-973572118 tempest-SecurityGroupsTestJSON-973572118-project-member] Lock "2b688987-d4cf-4ebb-83c4-d5fa7f5bcbb9" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 207.641s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1465.198028] env[68906]: DEBUG nova.compute.manager [None req-b6863f75-a96e-4ba0-8871-761a25ce2a13 tempest-ServerDiskConfigTestJSON-1909384467 tempest-ServerDiskConfigTestJSON-1909384467-project-member] [instance: 3ce59687-c677-40bd-8af4-c2f4b576e86e] Starting instance... {{(pid=68906) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1465.223146] env[68906]: DEBUG nova.compute.manager [None req-b6863f75-a96e-4ba0-8871-761a25ce2a13 tempest-ServerDiskConfigTestJSON-1909384467 tempest-ServerDiskConfigTestJSON-1909384467-project-member] [instance: 3ce59687-c677-40bd-8af4-c2f4b576e86e] Instance disappeared before build. {{(pid=68906) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1465.243447] env[68906]: DEBUG oslo_concurrency.lockutils [None req-b6863f75-a96e-4ba0-8871-761a25ce2a13 tempest-ServerDiskConfigTestJSON-1909384467 tempest-ServerDiskConfigTestJSON-1909384467-project-member] Lock "3ce59687-c677-40bd-8af4-c2f4b576e86e" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 206.582s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1465.253682] env[68906]: DEBUG nova.compute.manager [None req-826cb636-5a60-4614-876a-92b085b28a4c tempest-ServerPasswordTestJSON-295808980 tempest-ServerPasswordTestJSON-295808980-project-member] [instance: 45c0d7ba-6d21-46d1-8bcb-0318bd93f885] Starting instance...
{{(pid=68906) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1465.279258] env[68906]: DEBUG nova.compute.manager [None req-826cb636-5a60-4614-876a-92b085b28a4c tempest-ServerPasswordTestJSON-295808980 tempest-ServerPasswordTestJSON-295808980-project-member] [instance: 45c0d7ba-6d21-46d1-8bcb-0318bd93f885] Instance disappeared before build. {{(pid=68906) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1465.302291] env[68906]: DEBUG oslo_concurrency.lockutils [None req-826cb636-5a60-4614-876a-92b085b28a4c tempest-ServerPasswordTestJSON-295808980 tempest-ServerPasswordTestJSON-295808980-project-member] Lock "45c0d7ba-6d21-46d1-8bcb-0318bd93f885" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 202.204s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1465.311379] env[68906]: DEBUG nova.compute.manager [None req-e8da611a-e036-4ed5-9519-7f53ab98e63d tempest-MultipleCreateTestJSON-422056473 tempest-MultipleCreateTestJSON-422056473-project-member] [instance: 75a4f8bc-09aa-4c9b-b705-fb84ddcf60ea] Starting instance... {{(pid=68906) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1465.342672] env[68906]: DEBUG nova.compute.manager [None req-e8da611a-e036-4ed5-9519-7f53ab98e63d tempest-MultipleCreateTestJSON-422056473 tempest-MultipleCreateTestJSON-422056473-project-member] [instance: 75a4f8bc-09aa-4c9b-b705-fb84ddcf60ea] Instance disappeared before build. {{(pid=68906) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1465.368411] env[68906]: DEBUG oslo_concurrency.lockutils [None req-e8da611a-e036-4ed5-9519-7f53ab98e63d tempest-MultipleCreateTestJSON-422056473 tempest-MultipleCreateTestJSON-422056473-project-member] Lock "75a4f8bc-09aa-4c9b-b705-fb84ddcf60ea" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 196.235s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1465.378990] env[68906]: DEBUG nova.compute.manager [None req-e8da611a-e036-4ed5-9519-7f53ab98e63d tempest-MultipleCreateTestJSON-422056473 tempest-MultipleCreateTestJSON-422056473-project-member] [instance: ff99f1e3-9a4a-487e-afcb-6d8439a0491d] Starting instance... {{(pid=68906) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1465.407022] env[68906]: DEBUG nova.compute.manager [None req-e8da611a-e036-4ed5-9519-7f53ab98e63d tempest-MultipleCreateTestJSON-422056473 tempest-MultipleCreateTestJSON-422056473-project-member] [instance: ff99f1e3-9a4a-487e-afcb-6d8439a0491d] Instance disappeared before build.
{{(pid=68906) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1465.428540] env[68906]: DEBUG oslo_concurrency.lockutils [None req-e8da611a-e036-4ed5-9519-7f53ab98e63d tempest-MultipleCreateTestJSON-422056473 tempest-MultipleCreateTestJSON-422056473-project-member] Lock "ff99f1e3-9a4a-487e-afcb-6d8439a0491d" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 196.265s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1465.437246] env[68906]: DEBUG nova.compute.manager [None req-1ff0dcfe-1419-4b8a-a4ce-0afaadd97797 tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] [instance: aed06616-d008-4695-b66e-9f40acf5ebd3] Starting instance... {{(pid=68906) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1465.492070] env[68906]: DEBUG oslo_concurrency.lockutils [None req-1ff0dcfe-1419-4b8a-a4ce-0afaadd97797 tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1465.492324] env[68906]: DEBUG oslo_concurrency.lockutils [None req-1ff0dcfe-1419-4b8a-a4ce-0afaadd97797 tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1465.494019] env[68906]: INFO nova.compute.claims [None req-1ff0dcfe-1419-4b8a-a4ce-0afaadd97797 tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] [instance: aed06616-d008-4695-b66e-9f40acf5ebd3] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1465.759571] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-221e4c85-4ac0-41d2-9e94-27a0da9b1b7f {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1465.768023] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d73a93a5-dcc7-4335-ab30-58f0c1ed0739 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1465.796774] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a461652f-b8bf-463e-959c-2797c6ca087b {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1465.804175] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b0c34fd7-340e-402e-9a6c-17889c9e4486 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1465.818057] env[68906]: DEBUG nova.compute.provider_tree [None req-1ff0dcfe-1419-4b8a-a4ce-0afaadd97797 tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] Inventory has not changed in ProviderTree for provider: 1119f6db-bfd7-4ef3-bdff-5c6974dc249b {{(pid=68906) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1465.826433] env[68906]: DEBUG nova.scheduler.client.report
[None req-1ff0dcfe-1419-4b8a-a4ce-0afaadd97797 tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] Inventory has not changed for provider 1119f6db-bfd7-4ef3-bdff-5c6974dc249b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 93, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68906) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1465.839347] env[68906]: DEBUG oslo_concurrency.lockutils [None req-1ff0dcfe-1419-4b8a-a4ce-0afaadd97797 tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.347s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1465.839841] env[68906]: DEBUG nova.compute.manager [None req-1ff0dcfe-1419-4b8a-a4ce-0afaadd97797 tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] [instance: aed06616-d008-4695-b66e-9f40acf5ebd3] Start building networks asynchronously for instance. {{(pid=68906) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1465.870803] env[68906]: DEBUG nova.compute.utils [None req-1ff0dcfe-1419-4b8a-a4ce-0afaadd97797 tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] Using /dev/sd instead of None {{(pid=68906) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1465.872255] env[68906]: DEBUG nova.compute.manager [None req-1ff0dcfe-1419-4b8a-a4ce-0afaadd97797 tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] [instance: aed06616-d008-4695-b66e-9f40acf5ebd3] Allocating IP information in the background. {{(pid=68906) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1465.872432] env[68906]: DEBUG nova.network.neutron [None req-1ff0dcfe-1419-4b8a-a4ce-0afaadd97797 tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] [instance: aed06616-d008-4695-b66e-9f40acf5ebd3] allocate_for_instance() {{(pid=68906) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1465.884636] env[68906]: DEBUG nova.compute.manager [None req-1ff0dcfe-1419-4b8a-a4ce-0afaadd97797 tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] [instance: aed06616-d008-4695-b66e-9f40acf5ebd3] Start building block device mappings for instance. 
{{(pid=68906) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1465.934719] env[68906]: DEBUG nova.policy [None req-1ff0dcfe-1419-4b8a-a4ce-0afaadd97797 tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'fa8acbdb3f304f67ba13b02e547844d1', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '35ea959a162d451db5103b94bf7da26a', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68906) authorize /opt/stack/nova/nova/policy.py:203}} [ 1465.953239] env[68906]: DEBUG nova.compute.manager [None req-1ff0dcfe-1419-4b8a-a4ce-0afaadd97797 tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] [instance: aed06616-d008-4695-b66e-9f40acf5ebd3] Start spawning the instance on the hypervisor. {{(pid=68906) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1465.984850] env[68906]: DEBUG nova.virt.hardware [None req-1ff0dcfe-1419-4b8a-a4ce-0afaadd97797 tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-17T13:00:39Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-17T13:00:23Z,direct_url=,disk_format='vmdk',id=b1400c31-d33b-4e13-944f-4c645e62493e,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='1ae7bf3a375d41c6af5e7536af51ffd1',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-17T13:00:24Z,virtual_size=,visibility=), allow threads: False {{(pid=68906) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1465.985119] env[68906]: DEBUG nova.virt.hardware [None req-1ff0dcfe-1419-4b8a-a4ce-0afaadd97797 tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] Flavor limits 0:0:0 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1465.985278] env[68906]: DEBUG nova.virt.hardware [None req-1ff0dcfe-1419-4b8a-a4ce-0afaadd97797 tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] Image limits 0:0:0 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1465.985456] env[68906]: DEBUG nova.virt.hardware [None req-1ff0dcfe-1419-4b8a-a4ce-0afaadd97797 tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] Flavor pref 0:0:0 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1465.985604] env[68906]: DEBUG nova.virt.hardware [None req-1ff0dcfe-1419-4b8a-a4ce-0afaadd97797 tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] Image pref 0:0:0 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1465.985749] env[68906]: DEBUG nova.virt.hardware [None req-1ff0dcfe-1419-4b8a-a4ce-0afaadd97797 tempest-ImagesTestJSON-1546870080 
tempest-ImagesTestJSON-1546870080-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1465.985956] env[68906]: DEBUG nova.virt.hardware [None req-1ff0dcfe-1419-4b8a-a4ce-0afaadd97797 tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68906) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1465.986493] env[68906]: DEBUG nova.virt.hardware [None req-1ff0dcfe-1419-4b8a-a4ce-0afaadd97797 tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68906) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1465.986709] env[68906]: DEBUG nova.virt.hardware [None req-1ff0dcfe-1419-4b8a-a4ce-0afaadd97797 tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] Got 1 possible topologies {{(pid=68906) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1465.986882] env[68906]: DEBUG nova.virt.hardware [None req-1ff0dcfe-1419-4b8a-a4ce-0afaadd97797 tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68906) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1465.987072] env[68906]: DEBUG nova.virt.hardware [None req-1ff0dcfe-1419-4b8a-a4ce-0afaadd97797 tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68906) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1465.988489] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-decba846-ded6-4c3c-876d-c4427419405e {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1465.996661] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-46a2db00-8716-4b4d-807c-4db1430dea39 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1466.236533] env[68906]: DEBUG nova.network.neutron [None req-1ff0dcfe-1419-4b8a-a4ce-0afaadd97797 tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] [instance: aed06616-d008-4695-b66e-9f40acf5ebd3] Successfully created port: 2a503d0e-acbe-42a6-8736-c0ce191f7ca9 {{(pid=68906) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1466.961406] env[68906]: DEBUG nova.compute.manager [req-0a034334-99fe-4c47-b246-e0d3384459d1 req-76c94f55-6255-49b6-bf81-524bf456c780 service nova] [instance: aed06616-d008-4695-b66e-9f40acf5ebd3] Received event network-vif-plugged-2a503d0e-acbe-42a6-8736-c0ce191f7ca9 {{(pid=68906) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1466.961697] env[68906]: DEBUG oslo_concurrency.lockutils [req-0a034334-99fe-4c47-b246-e0d3384459d1 req-76c94f55-6255-49b6-bf81-524bf456c780 service nova] Acquiring lock "aed06616-d008-4695-b66e-9f40acf5ebd3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=68906) inner
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1466.961884] env[68906]: DEBUG oslo_concurrency.lockutils [req-0a034334-99fe-4c47-b246-e0d3384459d1 req-76c94f55-6255-49b6-bf81-524bf456c780 service nova] Lock "aed06616-d008-4695-b66e-9f40acf5ebd3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1466.962070] env[68906]: DEBUG oslo_concurrency.lockutils [req-0a034334-99fe-4c47-b246-e0d3384459d1 req-76c94f55-6255-49b6-bf81-524bf456c780 service nova] Lock "aed06616-d008-4695-b66e-9f40acf5ebd3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1466.962245] env[68906]: DEBUG nova.compute.manager [req-0a034334-99fe-4c47-b246-e0d3384459d1 req-76c94f55-6255-49b6-bf81-524bf456c780 service nova] [instance: aed06616-d008-4695-b66e-9f40acf5ebd3] No waiting events found dispatching network-vif-plugged-2a503d0e-acbe-42a6-8736-c0ce191f7ca9 {{(pid=68906) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1466.962412] env[68906]: WARNING nova.compute.manager [req-0a034334-99fe-4c47-b246-e0d3384459d1 req-76c94f55-6255-49b6-bf81-524bf456c780 service nova] [instance: aed06616-d008-4695-b66e-9f40acf5ebd3] Received unexpected event network-vif-plugged-2a503d0e-acbe-42a6-8736-c0ce191f7ca9 for instance with vm_state building and task_state spawning. [ 1466.980516] env[68906]: DEBUG nova.network.neutron [None req-1ff0dcfe-1419-4b8a-a4ce-0afaadd97797 tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] [instance: aed06616-d008-4695-b66e-9f40acf5ebd3] Successfully updated port: 2a503d0e-acbe-42a6-8736-c0ce191f7ca9 {{(pid=68906) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1466.990045] env[68906]: DEBUG oslo_concurrency.lockutils [None req-1ff0dcfe-1419-4b8a-a4ce-0afaadd97797 tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] Acquiring lock "refresh_cache-aed06616-d008-4695-b66e-9f40acf5ebd3" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1466.990233] env[68906]: DEBUG oslo_concurrency.lockutils [None req-1ff0dcfe-1419-4b8a-a4ce-0afaadd97797 tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] Acquired lock "refresh_cache-aed06616-d008-4695-b66e-9f40acf5ebd3" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1466.990338] env[68906]: DEBUG nova.network.neutron [None req-1ff0dcfe-1419-4b8a-a4ce-0afaadd97797 tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] [instance: aed06616-d008-4695-b66e-9f40acf5ebd3] Building network info cache for instance {{(pid=68906) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1467.057121] env[68906]: DEBUG nova.network.neutron [None req-1ff0dcfe-1419-4b8a-a4ce-0afaadd97797 tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] [instance: aed06616-d008-4695-b66e-9f40acf5ebd3] Instance cache missing network info.
{{(pid=68906) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1467.305243] env[68906]: DEBUG nova.network.neutron [None req-1ff0dcfe-1419-4b8a-a4ce-0afaadd97797 tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] [instance: aed06616-d008-4695-b66e-9f40acf5ebd3] Updating instance_info_cache with network_info: [{"id": "2a503d0e-acbe-42a6-8736-c0ce191f7ca9", "address": "fa:16:3e:0a:df:c8", "network": {"id": "42998f86-911a-4af7-93b7-ffe19e2cd70c", "bridge": "br-int", "label": "tempest-ImagesTestJSON-563323785-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "35ea959a162d451db5103b94bf7da26a", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ea4fe416-47a6-4542-b59d-8c71ab4d6503", "external-id": "nsx-vlan-transportzone-369", "segmentation_id": 369, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap2a503d0e-ac", "ovs_interfaceid": "2a503d0e-acbe-42a6-8736-c0ce191f7ca9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68906) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1467.318424] env[68906]: DEBUG oslo_concurrency.lockutils [None req-1ff0dcfe-1419-4b8a-a4ce-0afaadd97797 tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] Releasing lock "refresh_cache-aed06616-d008-4695-b66e-9f40acf5ebd3" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1467.318731] env[68906]: DEBUG nova.compute.manager [None req-1ff0dcfe-1419-4b8a-a4ce-0afaadd97797 tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] [instance: aed06616-d008-4695-b66e-9f40acf5ebd3] Instance network_info: |[{"id": "2a503d0e-acbe-42a6-8736-c0ce191f7ca9", "address": "fa:16:3e:0a:df:c8", "network": {"id": "42998f86-911a-4af7-93b7-ffe19e2cd70c", "bridge": "br-int", "label": "tempest-ImagesTestJSON-563323785-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "35ea959a162d451db5103b94bf7da26a", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ea4fe416-47a6-4542-b59d-8c71ab4d6503", "external-id": "nsx-vlan-transportzone-369", "segmentation_id": 369, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap2a503d0e-ac", "ovs_interfaceid": "2a503d0e-acbe-42a6-8736-c0ce191f7ca9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=68906) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1467.319144] env[68906]: DEBUG 
nova.virt.vmwareapi.vmops [None req-1ff0dcfe-1419-4b8a-a4ce-0afaadd97797 tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] [instance: aed06616-d008-4695-b66e-9f40acf5ebd3] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:0a:df:c8', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'ea4fe416-47a6-4542-b59d-8c71ab4d6503', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '2a503d0e-acbe-42a6-8736-c0ce191f7ca9', 'vif_model': 'vmxnet3'}] {{(pid=68906) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1467.326789] env[68906]: DEBUG nova.virt.vmwareapi.vm_util [None req-1ff0dcfe-1419-4b8a-a4ce-0afaadd97797 tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] Creating folder: Project (35ea959a162d451db5103b94bf7da26a). Parent ref: group-v694750. {{(pid=68906) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1467.327403] env[68906]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-11731ebe-40b8-4ad2-a4be-aa4ea159c6db {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1467.339424] env[68906]: INFO nova.virt.vmwareapi.vm_util [None req-1ff0dcfe-1419-4b8a-a4ce-0afaadd97797 tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] Created folder: Project (35ea959a162d451db5103b94bf7da26a) in parent group-v694750. [ 1467.339711] env[68906]: DEBUG nova.virt.vmwareapi.vm_util [None req-1ff0dcfe-1419-4b8a-a4ce-0afaadd97797 tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] Creating folder: Instances. Parent ref: group-v694832. {{(pid=68906) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1467.339957] env[68906]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-d19d2082-608a-4070-902c-ce995de1ea9c {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1467.349705] env[68906]: INFO nova.virt.vmwareapi.vm_util [None req-1ff0dcfe-1419-4b8a-a4ce-0afaadd97797 tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] Created folder: Instances in parent group-v694832. [ 1467.349942] env[68906]: DEBUG oslo.service.loopingcall [None req-1ff0dcfe-1419-4b8a-a4ce-0afaadd97797 tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=68906) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1467.350147] env[68906]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: aed06616-d008-4695-b66e-9f40acf5ebd3] Creating VM on the ESX host {{(pid=68906) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1467.350358] env[68906]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-f33d14f1-e369-425d-9803-c780e8322108 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1467.371017] env[68906]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1467.371017] env[68906]: value = "task-3475401" [ 1467.371017] env[68906]: _type = "Task" [ 1467.371017] env[68906]: } to complete. 
{{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1467.380032] env[68906]: DEBUG oslo_vmware.api [-] Task: {'id': task-3475401, 'name': CreateVM_Task} progress is 0%. {{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1467.880300] env[68906]: DEBUG oslo_vmware.api [-] Task: {'id': task-3475401, 'name': CreateVM_Task, 'duration_secs': 0.312996} completed successfully. {{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1467.880470] env[68906]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: aed06616-d008-4695-b66e-9f40acf5ebd3] Created VM on the ESX host {{(pid=68906) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1467.881154] env[68906]: DEBUG oslo_concurrency.lockutils [None req-1ff0dcfe-1419-4b8a-a4ce-0afaadd97797 tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1467.881328] env[68906]: DEBUG oslo_concurrency.lockutils [None req-1ff0dcfe-1419-4b8a-a4ce-0afaadd97797 tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] Acquired lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1467.881698] env[68906]: DEBUG oslo_concurrency.lockutils [None req-1ff0dcfe-1419-4b8a-a4ce-0afaadd97797 tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1467.881933] env[68906]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-33cb573e-f872-4373-bede-a1a551d85c9f {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1467.886481] env[68906]: DEBUG oslo_vmware.api [None req-1ff0dcfe-1419-4b8a-a4ce-0afaadd97797 tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] Waiting for the task: (returnval){ [ 1467.886481] env[68906]: value = "session[52a3cb0d-1212-490c-2669-91043b4da4d8]5258811e-f420-c0e5-8aa5-7bce25890c33" [ 1467.886481] env[68906]: _type = "Task" [ 1467.886481] env[68906]: } to complete. {{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1467.894685] env[68906]: DEBUG oslo_vmware.api [None req-1ff0dcfe-1419-4b8a-a4ce-0afaadd97797 tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] Task: {'id': session[52a3cb0d-1212-490c-2669-91043b4da4d8]5258811e-f420-c0e5-8aa5-7bce25890c33, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1468.396202] env[68906]: DEBUG oslo_concurrency.lockutils [None req-1ff0dcfe-1419-4b8a-a4ce-0afaadd97797 tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] Releasing lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1468.396524] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-1ff0dcfe-1419-4b8a-a4ce-0afaadd97797 tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] [instance: aed06616-d008-4695-b66e-9f40acf5ebd3] Processing image b1400c31-d33b-4e13-944f-4c645e62493e {{(pid=68906) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1468.396637] env[68906]: DEBUG oslo_concurrency.lockutils [None req-1ff0dcfe-1419-4b8a-a4ce-0afaadd97797 tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e/b1400c31-d33b-4e13-944f-4c645e62493e.vmdk" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1468.992334] env[68906]: DEBUG nova.compute.manager [req-4fb01384-9675-4451-b8a4-952fb95649fe req-c87285a9-64df-45fc-ab57-8c5776402b17 service nova] [instance: aed06616-d008-4695-b66e-9f40acf5ebd3] Received event network-changed-2a503d0e-acbe-42a6-8736-c0ce191f7ca9 {{(pid=68906) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1468.992435] env[68906]: DEBUG nova.compute.manager [req-4fb01384-9675-4451-b8a4-952fb95649fe req-c87285a9-64df-45fc-ab57-8c5776402b17 service nova] [instance: aed06616-d008-4695-b66e-9f40acf5ebd3] Refreshing instance network info cache due to event network-changed-2a503d0e-acbe-42a6-8736-c0ce191f7ca9. {{(pid=68906) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1468.992658] env[68906]: DEBUG oslo_concurrency.lockutils [req-4fb01384-9675-4451-b8a4-952fb95649fe req-c87285a9-64df-45fc-ab57-8c5776402b17 service nova] Acquiring lock "refresh_cache-aed06616-d008-4695-b66e-9f40acf5ebd3" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1468.992823] env[68906]: DEBUG oslo_concurrency.lockutils [req-4fb01384-9675-4451-b8a4-952fb95649fe req-c87285a9-64df-45fc-ab57-8c5776402b17 service nova] Acquired lock "refresh_cache-aed06616-d008-4695-b66e-9f40acf5ebd3" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1468.992989] env[68906]: DEBUG nova.network.neutron [req-4fb01384-9675-4451-b8a4-952fb95649fe req-c87285a9-64df-45fc-ab57-8c5776402b17 service nova] [instance: aed06616-d008-4695-b66e-9f40acf5ebd3] Refreshing network info cache for port 2a503d0e-acbe-42a6-8736-c0ce191f7ca9 {{(pid=68906) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1469.233088] env[68906]: DEBUG nova.network.neutron [req-4fb01384-9675-4451-b8a4-952fb95649fe req-c87285a9-64df-45fc-ab57-8c5776402b17 service nova] [instance: aed06616-d008-4695-b66e-9f40acf5ebd3] Updated VIF entry in instance network info cache for port 2a503d0e-acbe-42a6-8736-c0ce191f7ca9. 
{{(pid=68906) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1469.233643] env[68906]: DEBUG nova.network.neutron [req-4fb01384-9675-4451-b8a4-952fb95649fe req-c87285a9-64df-45fc-ab57-8c5776402b17 service nova] [instance: aed06616-d008-4695-b66e-9f40acf5ebd3] Updating instance_info_cache with network_info: [{"id": "2a503d0e-acbe-42a6-8736-c0ce191f7ca9", "address": "fa:16:3e:0a:df:c8", "network": {"id": "42998f86-911a-4af7-93b7-ffe19e2cd70c", "bridge": "br-int", "label": "tempest-ImagesTestJSON-563323785-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "35ea959a162d451db5103b94bf7da26a", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ea4fe416-47a6-4542-b59d-8c71ab4d6503", "external-id": "nsx-vlan-transportzone-369", "segmentation_id": 369, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap2a503d0e-ac", "ovs_interfaceid": "2a503d0e-acbe-42a6-8736-c0ce191f7ca9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68906) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1469.245469] env[68906]: DEBUG oslo_concurrency.lockutils [req-4fb01384-9675-4451-b8a4-952fb95649fe req-c87285a9-64df-45fc-ab57-8c5776402b17 service nova] Releasing lock "refresh_cache-aed06616-d008-4695-b66e-9f40acf5ebd3" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1476.140658] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._cleanup_incomplete_migrations {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1476.141029] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Cleaning up deleted instances with incomplete migration {{(pid=68906) _cleanup_incomplete_migrations /opt/stack/nova/nova/compute/manager.py:11236}} [ 1481.141188] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1481.750195] env[68906]: DEBUG oslo_concurrency.lockutils [None req-5a9e0d39-3b4d-4e01-a665-7603e8d12e3f tempest-ServersNegativeTestMultiTenantJSON-1592777577 tempest-ServersNegativeTestMultiTenantJSON-1592777577-project-member] Acquiring lock "736db39c-e5e5-4a54-b85a-aa5c703f432e" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1481.750438] env[68906]: DEBUG oslo_concurrency.lockutils [None req-5a9e0d39-3b4d-4e01-a665-7603e8d12e3f tempest-ServersNegativeTestMultiTenantJSON-1592777577 tempest-ServersNegativeTestMultiTenantJSON-1592777577-project-member] Lock
"736db39c-e5e5-4a54-b85a-aa5c703f432e" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1482.148226] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1482.368568] env[68906]: DEBUG oslo_concurrency.lockutils [None req-31d44fc1-fdd4-4a79-b25f-9da7cce5ce50 tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] Acquiring lock "aed06616-d008-4695-b66e-9f40acf5ebd3" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1485.142067] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1485.142067] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Starting heal instance info cache {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 1485.142067] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Rebuilding the list of instances to heal {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 1485.165628] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: 9bd17d9a-2f34-4e99-91b1-a7c3fbc1f29a] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1485.165824] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: 4d36bb91-0cde-44cb-8706-d17740a9cf50] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1485.165918] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: db011373-7285-4882-8bce-d39cfa22fe80] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1485.166053] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: 1fdb401a-ac25-4418-803c-fc0b2297f2d4] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1485.166182] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: 6c28f571-e74a-48f9-9cc7-a9e4ddea8b09] Skipping network cache update for instance because it is Building.
{{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1485.166303] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: e0e595e3-e47e-4cf1-8977-f004eca942d1] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1485.166425] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: 7466df8a-59a9-49b9-bff7-c4efbeae3eee] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1485.166544] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: 89171680-c76d-4826-9236-379542661ffb] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1485.166663] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: 9b884416-df89-4d8c-b2ab-0667db52a718] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1485.166782] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: aed06616-d008-4695-b66e-9f40acf5ebd3] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1485.166904] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Didn't find any instances for network info cache update. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 1485.167418] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1486.140455] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1487.140323] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1487.140665] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=68906) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 1487.140665] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._run_pending_deletes {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1487.140820] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Cleaning up deleted instances {{(pid=68906) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11198}} [ 1487.150575] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] There are 0 instances to clean {{(pid=68906) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11207}} [ 1488.453733] env[68906]: DEBUG oslo_concurrency.lockutils [None req-70df84c4-75b7-4312-8ee4-e32fe1fafc2b tempest-DeleteServersTestJSON-1763795391 tempest-DeleteServersTestJSON-1763795391-project-member] Acquiring lock "7faa4c32-7572-4594-a760-e928607bf2b6" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1488.454123] env[68906]: DEBUG oslo_concurrency.lockutils [None req-70df84c4-75b7-4312-8ee4-e32fe1fafc2b tempest-DeleteServersTestJSON-1763795391 tempest-DeleteServersTestJSON-1763795391-project-member] Lock "7faa4c32-7572-4594-a760-e928607bf2b6" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1489.150642] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1490.136431] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1490.140148] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1490.140354] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager.update_available_resource {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1490.151291] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1490.151507] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s
{{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1490.151677] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1490.151830] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=68906) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1490.152988] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b9774e88-276b-4d82-b1f7-f7450489bebb {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1490.161531] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0afb7c98-94a1-420b-a48e-27c44e4ae6c0 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1490.175311] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-18bae57a-211a-4b46-924d-560a92562061 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1490.181504] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4605adc4-6c79-4122-808d-620f3c92a1cc {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1490.209724] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180970MB free_disk=93GB free_vcpus=48 pci_devices=None {{(pid=68906) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1490.209824] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1490.210048] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1490.352225] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 9bd17d9a-2f34-4e99-91b1-a7c3fbc1f29a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1490.352399] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 4d36bb91-0cde-44cb-8706-d17740a9cf50 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1490.352533] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance db011373-7285-4882-8bce-d39cfa22fe80 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1490.352737] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 1fdb401a-ac25-4418-803c-fc0b2297f2d4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1490.352892] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 6c28f571-e74a-48f9-9cc7-a9e4ddea8b09 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1490.353067] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance e0e595e3-e47e-4cf1-8977-f004eca942d1 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1490.353164] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 7466df8a-59a9-49b9-bff7-c4efbeae3eee actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1490.353266] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 89171680-c76d-4826-9236-379542661ffb actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1490.353378] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 9b884416-df89-4d8c-b2ab-0667db52a718 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1490.353491] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance aed06616-d008-4695-b66e-9f40acf5ebd3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1490.365785] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 17327bc3-433e-4006-93c7-e53714ed70c2 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1490.376623] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 32f5b54d-30bf-4fe9-9622-3ff74344b3f3 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1490.386671] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 922d81ba-c8d2-43ba-b1c5-f2943418d6a2 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1490.396935] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 302e2275-a3ec-48c5-899e-6f385190bfe8 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1490.407022] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance a59ab448-c4f1-4f54-be7a-7e204130f3f8 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1490.415652] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 736db39c-e5e5-4a54-b85a-aa5c703f432e has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1490.425274] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 7faa4c32-7572-4594-a760-e928607bf2b6 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1490.425500] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=68906) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1490.425646] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=68906) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1490.442069] env[68906]: DEBUG nova.scheduler.client.report [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Refreshing inventories for resource provider 1119f6db-bfd7-4ef3-bdff-5c6974dc249b {{(pid=68906) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:804}} [ 1490.455413] env[68906]: DEBUG nova.scheduler.client.report [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Updating ProviderTree inventory for provider 1119f6db-bfd7-4ef3-bdff-5c6974dc249b from _refresh_and_get_inventory using data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 93, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68906) _refresh_and_get_inventory /opt/stack/nova/nova/scheduler/client/report.py:768}} [ 1490.455624] env[68906]: DEBUG nova.compute.provider_tree [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Updating inventory in ProviderTree for provider 1119f6db-bfd7-4ef3-bdff-5c6974dc249b with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 93, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68906) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 1490.466264] env[68906]: DEBUG nova.scheduler.client.report [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Refreshing aggregate associations for resource provider 1119f6db-bfd7-4ef3-bdff-5c6974dc249b, aggregates: None {{(pid=68906) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:813}} [ 1490.482584] env[68906]: DEBUG nova.scheduler.client.report [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Refreshing trait associations for resource provider 1119f6db-bfd7-4ef3-bdff-5c6974dc249b, traits: 
COMPUTE_SAME_HOST_COLD_MIGRATE,COMPUTE_IMAGE_TYPE_VMDK,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NODE,COMPUTE_IMAGE_TYPE_ISO {{(pid=68906) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:825}} [ 1490.657642] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2ac63289-1b20-40fc-98c8-5bd7a8d9bb29 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1490.665423] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-256ebb95-01a4-450a-9656-21d1509711d5 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1490.694647] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-21501d14-048c-4971-8ac1-37b3e82aa73b {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1490.701583] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dcd2d09c-092d-4336-867f-3a1fc3468087 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1490.714199] env[68906]: DEBUG nova.compute.provider_tree [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Inventory has not changed in ProviderTree for provider: 1119f6db-bfd7-4ef3-bdff-5c6974dc249b {{(pid=68906) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1490.722387] env[68906]: DEBUG nova.scheduler.client.report [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Inventory has not changed for provider 1119f6db-bfd7-4ef3-bdff-5c6974dc249b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 93, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68906) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1490.735254] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=68906) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1490.735427] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.525s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1512.786236] env[68906]: WARNING oslo_vmware.rw_handles [None req-0a3299be-e4a9-4311-8e53-c592324fe331 tempest-AttachInterfacesUnderV243Test-365796744 tempest-AttachInterfacesUnderV243Test-365796744-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1512.786236] env[68906]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1512.786236] env[68906]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1512.786236] env[68906]: ERROR 
oslo_vmware.rw_handles self._conn.getresponse() [ 1512.786236] env[68906]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1512.786236] env[68906]: ERROR oslo_vmware.rw_handles response.begin() [ 1512.786236] env[68906]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1512.786236] env[68906]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1512.786236] env[68906]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1512.786236] env[68906]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1512.786236] env[68906]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1512.786236] env[68906]: ERROR oslo_vmware.rw_handles [ 1512.786960] env[68906]: DEBUG nova.virt.vmwareapi.images [None req-0a3299be-e4a9-4311-8e53-c592324fe331 tempest-AttachInterfacesUnderV243Test-365796744 tempest-AttachInterfacesUnderV243Test-365796744-project-member] [instance: 9bd17d9a-2f34-4e99-91b1-a7c3fbc1f29a] Downloaded image file data b1400c31-d33b-4e13-944f-4c645e62493e to vmware_temp/ea0d0c3b-80ed-4819-8393-c05c4a12471a/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk on the data store datastore2 {{(pid=68906) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1512.788621] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-0a3299be-e4a9-4311-8e53-c592324fe331 tempest-AttachInterfacesUnderV243Test-365796744 tempest-AttachInterfacesUnderV243Test-365796744-project-member] [instance: 9bd17d9a-2f34-4e99-91b1-a7c3fbc1f29a] Caching image {{(pid=68906) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1512.788867] env[68906]: DEBUG nova.virt.vmwareapi.vm_util [None req-0a3299be-e4a9-4311-8e53-c592324fe331 tempest-AttachInterfacesUnderV243Test-365796744 tempest-AttachInterfacesUnderV243Test-365796744-project-member] Copying Virtual Disk [datastore2] vmware_temp/ea0d0c3b-80ed-4819-8393-c05c4a12471a/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk to [datastore2] vmware_temp/ea0d0c3b-80ed-4819-8393-c05c4a12471a/b1400c31-d33b-4e13-944f-4c645e62493e/b1400c31-d33b-4e13-944f-4c645e62493e.vmdk {{(pid=68906) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1512.789227] env[68906]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-96cd9851-656d-4c0b-8fcb-0f377c7f1585 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1512.797545] env[68906]: DEBUG oslo_vmware.api [None req-0a3299be-e4a9-4311-8e53-c592324fe331 tempest-AttachInterfacesUnderV243Test-365796744 tempest-AttachInterfacesUnderV243Test-365796744-project-member] Waiting for the task: (returnval){ [ 1512.797545] env[68906]: value = "task-3475402" [ 1512.797545] env[68906]: _type = "Task" [ 1512.797545] env[68906]: } to complete. {{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1512.805440] env[68906]: DEBUG oslo_vmware.api [None req-0a3299be-e4a9-4311-8e53-c592324fe331 tempest-AttachInterfacesUnderV243Test-365796744 tempest-AttachInterfacesUnderV243Test-365796744-project-member] Task: {'id': task-3475402, 'name': CopyVirtualDisk_Task} progress is 0%. 
{{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1513.307899] env[68906]: DEBUG oslo_vmware.exceptions [None req-0a3299be-e4a9-4311-8e53-c592324fe331 tempest-AttachInterfacesUnderV243Test-365796744 tempest-AttachInterfacesUnderV243Test-365796744-project-member] Fault InvalidArgument not matched. {{(pid=68906) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1513.308581] env[68906]: DEBUG oslo_concurrency.lockutils [None req-0a3299be-e4a9-4311-8e53-c592324fe331 tempest-AttachInterfacesUnderV243Test-365796744 tempest-AttachInterfacesUnderV243Test-365796744-project-member] Releasing lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e/b1400c31-d33b-4e13-944f-4c645e62493e.vmdk" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1513.308976] env[68906]: ERROR nova.compute.manager [None req-0a3299be-e4a9-4311-8e53-c592324fe331 tempest-AttachInterfacesUnderV243Test-365796744 tempest-AttachInterfacesUnderV243Test-365796744-project-member] [instance: 9bd17d9a-2f34-4e99-91b1-a7c3fbc1f29a] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1513.308976] env[68906]: Faults: ['InvalidArgument'] [ 1513.308976] env[68906]: ERROR nova.compute.manager [instance: 9bd17d9a-2f34-4e99-91b1-a7c3fbc1f29a] Traceback (most recent call last): [ 1513.308976] env[68906]: ERROR nova.compute.manager [instance: 9bd17d9a-2f34-4e99-91b1-a7c3fbc1f29a] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1513.308976] env[68906]: ERROR nova.compute.manager [instance: 9bd17d9a-2f34-4e99-91b1-a7c3fbc1f29a] yield resources [ 1513.308976] env[68906]: ERROR nova.compute.manager [instance: 9bd17d9a-2f34-4e99-91b1-a7c3fbc1f29a] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1513.308976] env[68906]: ERROR nova.compute.manager [instance: 9bd17d9a-2f34-4e99-91b1-a7c3fbc1f29a] self.driver.spawn(context, instance, image_meta, [ 1513.308976] env[68906]: ERROR nova.compute.manager [instance: 9bd17d9a-2f34-4e99-91b1-a7c3fbc1f29a] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1513.308976] env[68906]: ERROR nova.compute.manager [instance: 9bd17d9a-2f34-4e99-91b1-a7c3fbc1f29a] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1513.308976] env[68906]: ERROR nova.compute.manager [instance: 9bd17d9a-2f34-4e99-91b1-a7c3fbc1f29a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1513.308976] env[68906]: ERROR nova.compute.manager [instance: 9bd17d9a-2f34-4e99-91b1-a7c3fbc1f29a] self._fetch_image_if_missing(context, vi) [ 1513.308976] env[68906]: ERROR nova.compute.manager [instance: 9bd17d9a-2f34-4e99-91b1-a7c3fbc1f29a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1513.309335] env[68906]: ERROR nova.compute.manager [instance: 9bd17d9a-2f34-4e99-91b1-a7c3fbc1f29a] image_cache(vi, tmp_image_ds_loc) [ 1513.309335] env[68906]: ERROR nova.compute.manager [instance: 9bd17d9a-2f34-4e99-91b1-a7c3fbc1f29a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1513.309335] env[68906]: ERROR nova.compute.manager [instance: 9bd17d9a-2f34-4e99-91b1-a7c3fbc1f29a] vm_util.copy_virtual_disk( [ 1513.309335] env[68906]: ERROR nova.compute.manager [instance: 9bd17d9a-2f34-4e99-91b1-a7c3fbc1f29a] File 
"/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1513.309335] env[68906]: ERROR nova.compute.manager [instance: 9bd17d9a-2f34-4e99-91b1-a7c3fbc1f29a] session._wait_for_task(vmdk_copy_task) [ 1513.309335] env[68906]: ERROR nova.compute.manager [instance: 9bd17d9a-2f34-4e99-91b1-a7c3fbc1f29a] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1513.309335] env[68906]: ERROR nova.compute.manager [instance: 9bd17d9a-2f34-4e99-91b1-a7c3fbc1f29a] return self.wait_for_task(task_ref) [ 1513.309335] env[68906]: ERROR nova.compute.manager [instance: 9bd17d9a-2f34-4e99-91b1-a7c3fbc1f29a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1513.309335] env[68906]: ERROR nova.compute.manager [instance: 9bd17d9a-2f34-4e99-91b1-a7c3fbc1f29a] return evt.wait() [ 1513.309335] env[68906]: ERROR nova.compute.manager [instance: 9bd17d9a-2f34-4e99-91b1-a7c3fbc1f29a] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1513.309335] env[68906]: ERROR nova.compute.manager [instance: 9bd17d9a-2f34-4e99-91b1-a7c3fbc1f29a] result = hub.switch() [ 1513.309335] env[68906]: ERROR nova.compute.manager [instance: 9bd17d9a-2f34-4e99-91b1-a7c3fbc1f29a] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1513.309335] env[68906]: ERROR nova.compute.manager [instance: 9bd17d9a-2f34-4e99-91b1-a7c3fbc1f29a] return self.greenlet.switch() [ 1513.309753] env[68906]: ERROR nova.compute.manager [instance: 9bd17d9a-2f34-4e99-91b1-a7c3fbc1f29a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1513.309753] env[68906]: ERROR nova.compute.manager [instance: 9bd17d9a-2f34-4e99-91b1-a7c3fbc1f29a] self.f(*self.args, **self.kw) [ 1513.309753] env[68906]: ERROR nova.compute.manager [instance: 9bd17d9a-2f34-4e99-91b1-a7c3fbc1f29a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1513.309753] env[68906]: ERROR nova.compute.manager [instance: 9bd17d9a-2f34-4e99-91b1-a7c3fbc1f29a] raise exceptions.translate_fault(task_info.error) [ 1513.309753] env[68906]: ERROR nova.compute.manager [instance: 9bd17d9a-2f34-4e99-91b1-a7c3fbc1f29a] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1513.309753] env[68906]: ERROR nova.compute.manager [instance: 9bd17d9a-2f34-4e99-91b1-a7c3fbc1f29a] Faults: ['InvalidArgument'] [ 1513.309753] env[68906]: ERROR nova.compute.manager [instance: 9bd17d9a-2f34-4e99-91b1-a7c3fbc1f29a] [ 1513.309753] env[68906]: INFO nova.compute.manager [None req-0a3299be-e4a9-4311-8e53-c592324fe331 tempest-AttachInterfacesUnderV243Test-365796744 tempest-AttachInterfacesUnderV243Test-365796744-project-member] [instance: 9bd17d9a-2f34-4e99-91b1-a7c3fbc1f29a] Terminating instance [ 1513.310831] env[68906]: DEBUG oslo_concurrency.lockutils [None req-bf384869-204f-4e55-8c64-1babf5601dfe tempest-ServerMetadataTestJSON-461341781 tempest-ServerMetadataTestJSON-461341781-project-member] Acquired lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e/b1400c31-d33b-4e13-944f-4c645e62493e.vmdk" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1513.311057] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-bf384869-204f-4e55-8c64-1babf5601dfe tempest-ServerMetadataTestJSON-461341781 
tempest-ServerMetadataTestJSON-461341781-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=68906) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1513.311288] env[68906]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-f15fe6fe-4fbb-4194-9146-7fd0e17639fc {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1513.313395] env[68906]: DEBUG nova.compute.manager [None req-0a3299be-e4a9-4311-8e53-c592324fe331 tempest-AttachInterfacesUnderV243Test-365796744 tempest-AttachInterfacesUnderV243Test-365796744-project-member] [instance: 9bd17d9a-2f34-4e99-91b1-a7c3fbc1f29a] Start destroying the instance on the hypervisor. {{(pid=68906) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1513.313614] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-0a3299be-e4a9-4311-8e53-c592324fe331 tempest-AttachInterfacesUnderV243Test-365796744 tempest-AttachInterfacesUnderV243Test-365796744-project-member] [instance: 9bd17d9a-2f34-4e99-91b1-a7c3fbc1f29a] Destroying instance {{(pid=68906) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1513.314324] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3701981d-6a55-45ba-ac5c-200692b2bfe8 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1513.321018] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-0a3299be-e4a9-4311-8e53-c592324fe331 tempest-AttachInterfacesUnderV243Test-365796744 tempest-AttachInterfacesUnderV243Test-365796744-project-member] [instance: 9bd17d9a-2f34-4e99-91b1-a7c3fbc1f29a] Unregistering the VM {{(pid=68906) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1513.321235] env[68906]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-dfec753f-4ae3-49cf-a4ce-e0ad81b8207b {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1513.323315] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-bf384869-204f-4e55-8c64-1babf5601dfe tempest-ServerMetadataTestJSON-461341781 tempest-ServerMetadataTestJSON-461341781-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=68906) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1513.323487] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-bf384869-204f-4e55-8c64-1babf5601dfe tempest-ServerMetadataTestJSON-461341781 tempest-ServerMetadataTestJSON-461341781-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=68906) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1513.324450] env[68906]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-51c7c892-d87a-4990-b798-a860039b683e {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1513.329054] env[68906]: DEBUG oslo_vmware.api [None req-bf384869-204f-4e55-8c64-1babf5601dfe tempest-ServerMetadataTestJSON-461341781 tempest-ServerMetadataTestJSON-461341781-project-member] Waiting for the task: (returnval){ [ 1513.329054] env[68906]: value = "session[52a3cb0d-1212-490c-2669-91043b4da4d8]52ffd078-9f34-861f-393a-a6b091aba587" [ 1513.329054] env[68906]: _type = "Task" [ 1513.329054] env[68906]: } to complete. 
{{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1513.336113] env[68906]: DEBUG oslo_vmware.api [None req-bf384869-204f-4e55-8c64-1babf5601dfe tempest-ServerMetadataTestJSON-461341781 tempest-ServerMetadataTestJSON-461341781-project-member] Task: {'id': session[52a3cb0d-1212-490c-2669-91043b4da4d8]52ffd078-9f34-861f-393a-a6b091aba587, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1513.393343] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-0a3299be-e4a9-4311-8e53-c592324fe331 tempest-AttachInterfacesUnderV243Test-365796744 tempest-AttachInterfacesUnderV243Test-365796744-project-member] [instance: 9bd17d9a-2f34-4e99-91b1-a7c3fbc1f29a] Unregistered the VM {{(pid=68906) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1513.393545] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-0a3299be-e4a9-4311-8e53-c592324fe331 tempest-AttachInterfacesUnderV243Test-365796744 tempest-AttachInterfacesUnderV243Test-365796744-project-member] [instance: 9bd17d9a-2f34-4e99-91b1-a7c3fbc1f29a] Deleting contents of the VM from datastore datastore2 {{(pid=68906) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1513.393722] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-0a3299be-e4a9-4311-8e53-c592324fe331 tempest-AttachInterfacesUnderV243Test-365796744 tempest-AttachInterfacesUnderV243Test-365796744-project-member] Deleting the datastore file [datastore2] 9bd17d9a-2f34-4e99-91b1-a7c3fbc1f29a {{(pid=68906) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1513.393965] env[68906]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-7bf1f071-3717-4155-a386-3286bff36ced {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1513.399630] env[68906]: DEBUG oslo_vmware.api [None req-0a3299be-e4a9-4311-8e53-c592324fe331 tempest-AttachInterfacesUnderV243Test-365796744 tempest-AttachInterfacesUnderV243Test-365796744-project-member] Waiting for the task: (returnval){ [ 1513.399630] env[68906]: value = "task-3475404" [ 1513.399630] env[68906]: _type = "Task" [ 1513.399630] env[68906]: } to complete. {{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1513.406995] env[68906]: DEBUG oslo_vmware.api [None req-0a3299be-e4a9-4311-8e53-c592324fe331 tempest-AttachInterfacesUnderV243Test-365796744 tempest-AttachInterfacesUnderV243Test-365796744-project-member] Task: {'id': task-3475404, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1513.838962] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-bf384869-204f-4e55-8c64-1babf5601dfe tempest-ServerMetadataTestJSON-461341781 tempest-ServerMetadataTestJSON-461341781-project-member] [instance: 4d36bb91-0cde-44cb-8706-d17740a9cf50] Preparing fetch location {{(pid=68906) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1513.839289] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-bf384869-204f-4e55-8c64-1babf5601dfe tempest-ServerMetadataTestJSON-461341781 tempest-ServerMetadataTestJSON-461341781-project-member] Creating directory with path [datastore2] vmware_temp/8357a756-252d-4158-a384-de001ddee4e4/b1400c31-d33b-4e13-944f-4c645e62493e {{(pid=68906) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1513.839468] env[68906]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-36cd20df-29b0-4abe-92ad-8651624af572 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1513.850650] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-bf384869-204f-4e55-8c64-1babf5601dfe tempest-ServerMetadataTestJSON-461341781 tempest-ServerMetadataTestJSON-461341781-project-member] Created directory with path [datastore2] vmware_temp/8357a756-252d-4158-a384-de001ddee4e4/b1400c31-d33b-4e13-944f-4c645e62493e {{(pid=68906) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1513.850831] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-bf384869-204f-4e55-8c64-1babf5601dfe tempest-ServerMetadataTestJSON-461341781 tempest-ServerMetadataTestJSON-461341781-project-member] [instance: 4d36bb91-0cde-44cb-8706-d17740a9cf50] Fetch image to [datastore2] vmware_temp/8357a756-252d-4158-a384-de001ddee4e4/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk {{(pid=68906) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1513.851011] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-bf384869-204f-4e55-8c64-1babf5601dfe tempest-ServerMetadataTestJSON-461341781 tempest-ServerMetadataTestJSON-461341781-project-member] [instance: 4d36bb91-0cde-44cb-8706-d17740a9cf50] Downloading image file data b1400c31-d33b-4e13-944f-4c645e62493e to [datastore2] vmware_temp/8357a756-252d-4158-a384-de001ddee4e4/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk on the data store datastore2 {{(pid=68906) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1513.851736] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2d141989-2a0e-4a35-a5c3-699284bc58d9 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1513.858202] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dd104808-f3e2-4663-99b3-f74f98ec0103 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1513.867014] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-05cdec96-9d73-4b1e-bac1-148931f3a21f {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1513.898456] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e4f59e4e-814a-40b3-9a47-61fdc71ab502 
{{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1513.909084] env[68906]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-a9d0a359-5a3d-4098-b525-184d8852129d {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1513.910765] env[68906]: DEBUG oslo_vmware.api [None req-0a3299be-e4a9-4311-8e53-c592324fe331 tempest-AttachInterfacesUnderV243Test-365796744 tempest-AttachInterfacesUnderV243Test-365796744-project-member] Task: {'id': task-3475404, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.077245} completed successfully. {{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1513.911009] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-0a3299be-e4a9-4311-8e53-c592324fe331 tempest-AttachInterfacesUnderV243Test-365796744 tempest-AttachInterfacesUnderV243Test-365796744-project-member] Deleted the datastore file {{(pid=68906) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1513.911200] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-0a3299be-e4a9-4311-8e53-c592324fe331 tempest-AttachInterfacesUnderV243Test-365796744 tempest-AttachInterfacesUnderV243Test-365796744-project-member] [instance: 9bd17d9a-2f34-4e99-91b1-a7c3fbc1f29a] Deleted contents of the VM from datastore datastore2 {{(pid=68906) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1513.911367] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-0a3299be-e4a9-4311-8e53-c592324fe331 tempest-AttachInterfacesUnderV243Test-365796744 tempest-AttachInterfacesUnderV243Test-365796744-project-member] [instance: 9bd17d9a-2f34-4e99-91b1-a7c3fbc1f29a] Instance destroyed {{(pid=68906) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1513.911540] env[68906]: INFO nova.compute.manager [None req-0a3299be-e4a9-4311-8e53-c592324fe331 tempest-AttachInterfacesUnderV243Test-365796744 tempest-AttachInterfacesUnderV243Test-365796744-project-member] [instance: 9bd17d9a-2f34-4e99-91b1-a7c3fbc1f29a] Took 0.60 seconds to destroy the instance on the hypervisor. 
[ 1513.913635] env[68906]: DEBUG nova.compute.claims [None req-0a3299be-e4a9-4311-8e53-c592324fe331 tempest-AttachInterfacesUnderV243Test-365796744 tempest-AttachInterfacesUnderV243Test-365796744-project-member] [instance: 9bd17d9a-2f34-4e99-91b1-a7c3fbc1f29a] Aborting claim: {{(pid=68906) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1513.913799] env[68906]: DEBUG oslo_concurrency.lockutils [None req-0a3299be-e4a9-4311-8e53-c592324fe331 tempest-AttachInterfacesUnderV243Test-365796744 tempest-AttachInterfacesUnderV243Test-365796744-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1513.914033] env[68906]: DEBUG oslo_concurrency.lockutils [None req-0a3299be-e4a9-4311-8e53-c592324fe331 tempest-AttachInterfacesUnderV243Test-365796744 tempest-AttachInterfacesUnderV243Test-365796744-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1513.930743] env[68906]: DEBUG nova.virt.vmwareapi.images [None req-bf384869-204f-4e55-8c64-1babf5601dfe tempest-ServerMetadataTestJSON-461341781 tempest-ServerMetadataTestJSON-461341781-project-member] [instance: 4d36bb91-0cde-44cb-8706-d17740a9cf50] Downloading image file data b1400c31-d33b-4e13-944f-4c645e62493e to the data store datastore2 {{(pid=68906) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1513.985065] env[68906]: DEBUG oslo_vmware.rw_handles [None req-bf384869-204f-4e55-8c64-1babf5601dfe tempest-ServerMetadataTestJSON-461341781 tempest-ServerMetadataTestJSON-461341781-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/8357a756-252d-4158-a384-de001ddee4e4/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=68906) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1514.045474] env[68906]: DEBUG oslo_vmware.rw_handles [None req-bf384869-204f-4e55-8c64-1babf5601dfe tempest-ServerMetadataTestJSON-461341781 tempest-ServerMetadataTestJSON-461341781-project-member] Completed reading data from the image iterator. {{(pid=68906) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1514.045474] env[68906]: DEBUG oslo_vmware.rw_handles [None req-bf384869-204f-4e55-8c64-1babf5601dfe tempest-ServerMetadataTestJSON-461341781 tempest-ServerMetadataTestJSON-461341781-project-member] Closing write handle for https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/8357a756-252d-4158-a384-de001ddee4e4/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2.
{{(pid=68906) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1514.202100] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9a41381a-ac59-4561-bd67-cd250b7023fa {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1514.210333] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-71935da9-07b8-4710-b6c2-16f3e06835e3 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1514.239559] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b3ba7761-6778-4b3b-bfd9-ffc8db601825 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1514.246578] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-630da074-c190-4441-97c2-ce625a3b3110 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1514.259816] env[68906]: DEBUG nova.compute.provider_tree [None req-0a3299be-e4a9-4311-8e53-c592324fe331 tempest-AttachInterfacesUnderV243Test-365796744 tempest-AttachInterfacesUnderV243Test-365796744-project-member] Inventory has not changed in ProviderTree for provider: 1119f6db-bfd7-4ef3-bdff-5c6974dc249b {{(pid=68906) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1514.271018] env[68906]: DEBUG nova.scheduler.client.report [None req-0a3299be-e4a9-4311-8e53-c592324fe331 tempest-AttachInterfacesUnderV243Test-365796744 tempest-AttachInterfacesUnderV243Test-365796744-project-member] Inventory has not changed for provider 1119f6db-bfd7-4ef3-bdff-5c6974dc249b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 93, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68906) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1514.283534] env[68906]: DEBUG oslo_concurrency.lockutils [None req-0a3299be-e4a9-4311-8e53-c592324fe331 tempest-AttachInterfacesUnderV243Test-365796744 tempest-AttachInterfacesUnderV243Test-365796744-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.369s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1514.284136] env[68906]: ERROR nova.compute.manager [None req-0a3299be-e4a9-4311-8e53-c592324fe331 tempest-AttachInterfacesUnderV243Test-365796744 tempest-AttachInterfacesUnderV243Test-365796744-project-member] [instance: 9bd17d9a-2f34-4e99-91b1-a7c3fbc1f29a] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1514.284136] env[68906]: Faults: ['InvalidArgument'] [ 1514.284136] env[68906]: ERROR nova.compute.manager [instance: 9bd17d9a-2f34-4e99-91b1-a7c3fbc1f29a] Traceback (most recent call last): [ 1514.284136] env[68906]: ERROR nova.compute.manager [instance: 9bd17d9a-2f34-4e99-91b1-a7c3fbc1f29a] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in 
_build_and_run_instance [ 1514.284136] env[68906]: ERROR nova.compute.manager [instance: 9bd17d9a-2f34-4e99-91b1-a7c3fbc1f29a] self.driver.spawn(context, instance, image_meta, [ 1514.284136] env[68906]: ERROR nova.compute.manager [instance: 9bd17d9a-2f34-4e99-91b1-a7c3fbc1f29a] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1514.284136] env[68906]: ERROR nova.compute.manager [instance: 9bd17d9a-2f34-4e99-91b1-a7c3fbc1f29a] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1514.284136] env[68906]: ERROR nova.compute.manager [instance: 9bd17d9a-2f34-4e99-91b1-a7c3fbc1f29a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1514.284136] env[68906]: ERROR nova.compute.manager [instance: 9bd17d9a-2f34-4e99-91b1-a7c3fbc1f29a] self._fetch_image_if_missing(context, vi) [ 1514.284136] env[68906]: ERROR nova.compute.manager [instance: 9bd17d9a-2f34-4e99-91b1-a7c3fbc1f29a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1514.284136] env[68906]: ERROR nova.compute.manager [instance: 9bd17d9a-2f34-4e99-91b1-a7c3fbc1f29a] image_cache(vi, tmp_image_ds_loc) [ 1514.284136] env[68906]: ERROR nova.compute.manager [instance: 9bd17d9a-2f34-4e99-91b1-a7c3fbc1f29a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1514.284477] env[68906]: ERROR nova.compute.manager [instance: 9bd17d9a-2f34-4e99-91b1-a7c3fbc1f29a] vm_util.copy_virtual_disk( [ 1514.284477] env[68906]: ERROR nova.compute.manager [instance: 9bd17d9a-2f34-4e99-91b1-a7c3fbc1f29a] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1514.284477] env[68906]: ERROR nova.compute.manager [instance: 9bd17d9a-2f34-4e99-91b1-a7c3fbc1f29a] session._wait_for_task(vmdk_copy_task) [ 1514.284477] env[68906]: ERROR nova.compute.manager [instance: 9bd17d9a-2f34-4e99-91b1-a7c3fbc1f29a] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1514.284477] env[68906]: ERROR nova.compute.manager [instance: 9bd17d9a-2f34-4e99-91b1-a7c3fbc1f29a] return self.wait_for_task(task_ref) [ 1514.284477] env[68906]: ERROR nova.compute.manager [instance: 9bd17d9a-2f34-4e99-91b1-a7c3fbc1f29a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1514.284477] env[68906]: ERROR nova.compute.manager [instance: 9bd17d9a-2f34-4e99-91b1-a7c3fbc1f29a] return evt.wait() [ 1514.284477] env[68906]: ERROR nova.compute.manager [instance: 9bd17d9a-2f34-4e99-91b1-a7c3fbc1f29a] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1514.284477] env[68906]: ERROR nova.compute.manager [instance: 9bd17d9a-2f34-4e99-91b1-a7c3fbc1f29a] result = hub.switch() [ 1514.284477] env[68906]: ERROR nova.compute.manager [instance: 9bd17d9a-2f34-4e99-91b1-a7c3fbc1f29a] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1514.284477] env[68906]: ERROR nova.compute.manager [instance: 9bd17d9a-2f34-4e99-91b1-a7c3fbc1f29a] return self.greenlet.switch() [ 1514.284477] env[68906]: ERROR nova.compute.manager [instance: 9bd17d9a-2f34-4e99-91b1-a7c3fbc1f29a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1514.284477] env[68906]: ERROR nova.compute.manager [instance: 9bd17d9a-2f34-4e99-91b1-a7c3fbc1f29a] self.f(*self.args, **self.kw) [ 1514.284781] env[68906]: ERROR nova.compute.manager [instance: 
9bd17d9a-2f34-4e99-91b1-a7c3fbc1f29a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1514.284781] env[68906]: ERROR nova.compute.manager [instance: 9bd17d9a-2f34-4e99-91b1-a7c3fbc1f29a] raise exceptions.translate_fault(task_info.error) [ 1514.284781] env[68906]: ERROR nova.compute.manager [instance: 9bd17d9a-2f34-4e99-91b1-a7c3fbc1f29a] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1514.284781] env[68906]: ERROR nova.compute.manager [instance: 9bd17d9a-2f34-4e99-91b1-a7c3fbc1f29a] Faults: ['InvalidArgument'] [ 1514.284781] env[68906]: ERROR nova.compute.manager [instance: 9bd17d9a-2f34-4e99-91b1-a7c3fbc1f29a] [ 1514.284896] env[68906]: DEBUG nova.compute.utils [None req-0a3299be-e4a9-4311-8e53-c592324fe331 tempest-AttachInterfacesUnderV243Test-365796744 tempest-AttachInterfacesUnderV243Test-365796744-project-member] [instance: 9bd17d9a-2f34-4e99-91b1-a7c3fbc1f29a] VimFaultException {{(pid=68906) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1514.286458] env[68906]: DEBUG nova.compute.manager [None req-0a3299be-e4a9-4311-8e53-c592324fe331 tempest-AttachInterfacesUnderV243Test-365796744 tempest-AttachInterfacesUnderV243Test-365796744-project-member] [instance: 9bd17d9a-2f34-4e99-91b1-a7c3fbc1f29a] Build of instance 9bd17d9a-2f34-4e99-91b1-a7c3fbc1f29a was re-scheduled: A specified parameter was not correct: fileType [ 1514.286458] env[68906]: Faults: ['InvalidArgument'] {{(pid=68906) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1514.286889] env[68906]: DEBUG nova.compute.manager [None req-0a3299be-e4a9-4311-8e53-c592324fe331 tempest-AttachInterfacesUnderV243Test-365796744 tempest-AttachInterfacesUnderV243Test-365796744-project-member] [instance: 9bd17d9a-2f34-4e99-91b1-a7c3fbc1f29a] Unplugging VIFs for instance {{(pid=68906) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1514.287100] env[68906]: DEBUG nova.compute.manager [None req-0a3299be-e4a9-4311-8e53-c592324fe331 tempest-AttachInterfacesUnderV243Test-365796744 tempest-AttachInterfacesUnderV243Test-365796744-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=68906) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1514.287326] env[68906]: DEBUG nova.compute.manager [None req-0a3299be-e4a9-4311-8e53-c592324fe331 tempest-AttachInterfacesUnderV243Test-365796744 tempest-AttachInterfacesUnderV243Test-365796744-project-member] [instance: 9bd17d9a-2f34-4e99-91b1-a7c3fbc1f29a] Deallocating network for instance {{(pid=68906) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1514.287524] env[68906]: DEBUG nova.network.neutron [None req-0a3299be-e4a9-4311-8e53-c592324fe331 tempest-AttachInterfacesUnderV243Test-365796744 tempest-AttachInterfacesUnderV243Test-365796744-project-member] [instance: 9bd17d9a-2f34-4e99-91b1-a7c3fbc1f29a] deallocate_for_instance() {{(pid=68906) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1514.678217] env[68906]: DEBUG nova.network.neutron [None req-0a3299be-e4a9-4311-8e53-c592324fe331 tempest-AttachInterfacesUnderV243Test-365796744 tempest-AttachInterfacesUnderV243Test-365796744-project-member] [instance: 9bd17d9a-2f34-4e99-91b1-a7c3fbc1f29a] Updating instance_info_cache with network_info: [] {{(pid=68906) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1514.693455] env[68906]: INFO nova.compute.manager [None req-0a3299be-e4a9-4311-8e53-c592324fe331 tempest-AttachInterfacesUnderV243Test-365796744 tempest-AttachInterfacesUnderV243Test-365796744-project-member] [instance: 9bd17d9a-2f34-4e99-91b1-a7c3fbc1f29a] Took 0.41 seconds to deallocate network for instance. [ 1514.787527] env[68906]: INFO nova.scheduler.client.report [None req-0a3299be-e4a9-4311-8e53-c592324fe331 tempest-AttachInterfacesUnderV243Test-365796744 tempest-AttachInterfacesUnderV243Test-365796744-project-member] Deleted allocations for instance 9bd17d9a-2f34-4e99-91b1-a7c3fbc1f29a [ 1514.808772] env[68906]: DEBUG oslo_concurrency.lockutils [None req-0a3299be-e4a9-4311-8e53-c592324fe331 tempest-AttachInterfacesUnderV243Test-365796744 tempest-AttachInterfacesUnderV243Test-365796744-project-member] Lock "9bd17d9a-2f34-4e99-91b1-a7c3fbc1f29a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 635.189s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1514.810087] env[68906]: DEBUG oslo_concurrency.lockutils [None req-68edea35-7d38-466f-87ca-34c780500d3b tempest-AttachInterfacesUnderV243Test-365796744 tempest-AttachInterfacesUnderV243Test-365796744-project-member] Lock "9bd17d9a-2f34-4e99-91b1-a7c3fbc1f29a" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 438.406s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1514.810322] env[68906]: DEBUG oslo_concurrency.lockutils [None req-68edea35-7d38-466f-87ca-34c780500d3b tempest-AttachInterfacesUnderV243Test-365796744 tempest-AttachInterfacesUnderV243Test-365796744-project-member] Acquiring lock "9bd17d9a-2f34-4e99-91b1-a7c3fbc1f29a-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1514.810562] env[68906]: DEBUG oslo_concurrency.lockutils [None req-68edea35-7d38-466f-87ca-34c780500d3b tempest-AttachInterfacesUnderV243Test-365796744 tempest-AttachInterfacesUnderV243Test-365796744-project-member] Lock
"9bd17d9a-2f34-4e99-91b1-a7c3fbc1f29a-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1514.810739] env[68906]: DEBUG oslo_concurrency.lockutils [None req-68edea35-7d38-466f-87ca-34c780500d3b tempest-AttachInterfacesUnderV243Test-365796744 tempest-AttachInterfacesUnderV243Test-365796744-project-member] Lock "9bd17d9a-2f34-4e99-91b1-a7c3fbc1f29a-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1514.813593] env[68906]: INFO nova.compute.manager [None req-68edea35-7d38-466f-87ca-34c780500d3b tempest-AttachInterfacesUnderV243Test-365796744 tempest-AttachInterfacesUnderV243Test-365796744-project-member] [instance: 9bd17d9a-2f34-4e99-91b1-a7c3fbc1f29a] Terminating instance [ 1514.814320] env[68906]: DEBUG oslo_concurrency.lockutils [None req-68edea35-7d38-466f-87ca-34c780500d3b tempest-AttachInterfacesUnderV243Test-365796744 tempest-AttachInterfacesUnderV243Test-365796744-project-member] Acquiring lock "refresh_cache-9bd17d9a-2f34-4e99-91b1-a7c3fbc1f29a" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1514.814550] env[68906]: DEBUG oslo_concurrency.lockutils [None req-68edea35-7d38-466f-87ca-34c780500d3b tempest-AttachInterfacesUnderV243Test-365796744 tempest-AttachInterfacesUnderV243Test-365796744-project-member] Acquired lock "refresh_cache-9bd17d9a-2f34-4e99-91b1-a7c3fbc1f29a" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1514.815010] env[68906]: DEBUG nova.network.neutron [None req-68edea35-7d38-466f-87ca-34c780500d3b tempest-AttachInterfacesUnderV243Test-365796744 tempest-AttachInterfacesUnderV243Test-365796744-project-member] [instance: 9bd17d9a-2f34-4e99-91b1-a7c3fbc1f29a] Building network info cache for instance {{(pid=68906) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1514.820142] env[68906]: DEBUG nova.compute.manager [None req-18f49ac7-7eb5-48de-bf53-912002f6914c tempest-DeleteServersTestJSON-1763795391 tempest-DeleteServersTestJSON-1763795391-project-member] [instance: c3386804-6ed9-46fe-b26d-3b5aae52c84b] Starting instance... {{(pid=68906) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1514.846275] env[68906]: DEBUG nova.compute.manager [None req-18f49ac7-7eb5-48de-bf53-912002f6914c tempest-DeleteServersTestJSON-1763795391 tempest-DeleteServersTestJSON-1763795391-project-member] [instance: c3386804-6ed9-46fe-b26d-3b5aae52c84b] Instance disappeared before build. {{(pid=68906) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1514.852196] env[68906]: DEBUG nova.network.neutron [None req-68edea35-7d38-466f-87ca-34c780500d3b tempest-AttachInterfacesUnderV243Test-365796744 tempest-AttachInterfacesUnderV243Test-365796744-project-member] [instance: 9bd17d9a-2f34-4e99-91b1-a7c3fbc1f29a] Instance cache missing network info. 
{{(pid=68906) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1514.869748] env[68906]: DEBUG oslo_concurrency.lockutils [None req-18f49ac7-7eb5-48de-bf53-912002f6914c tempest-DeleteServersTestJSON-1763795391 tempest-DeleteServersTestJSON-1763795391-project-member] Lock "c3386804-6ed9-46fe-b26d-3b5aae52c84b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 223.047s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1514.881164] env[68906]: DEBUG nova.compute.manager [None req-b9080d5d-4234-4166-bade-e67041a2b08f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] [instance: 17327bc3-433e-4006-93c7-e53714ed70c2] Starting instance... {{(pid=68906) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1514.930426] env[68906]: DEBUG oslo_concurrency.lockutils [None req-b9080d5d-4234-4166-bade-e67041a2b08f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1514.930735] env[68906]: DEBUG oslo_concurrency.lockutils [None req-b9080d5d-4234-4166-bade-e67041a2b08f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1514.932283] env[68906]: INFO nova.compute.claims [None req-b9080d5d-4234-4166-bade-e67041a2b08f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] [instance: 17327bc3-433e-4006-93c7-e53714ed70c2] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1515.124990] env[68906]: DEBUG nova.network.neutron [None req-68edea35-7d38-466f-87ca-34c780500d3b tempest-AttachInterfacesUnderV243Test-365796744 tempest-AttachInterfacesUnderV243Test-365796744-project-member] [instance: 9bd17d9a-2f34-4e99-91b1-a7c3fbc1f29a] Updating instance_info_cache with network_info: [] {{(pid=68906) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1515.134359] env[68906]: DEBUG oslo_concurrency.lockutils [None req-68edea35-7d38-466f-87ca-34c780500d3b tempest-AttachInterfacesUnderV243Test-365796744 tempest-AttachInterfacesUnderV243Test-365796744-project-member] Releasing lock "refresh_cache-9bd17d9a-2f34-4e99-91b1-a7c3fbc1f29a" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1515.134951] env[68906]: DEBUG nova.compute.manager [None req-68edea35-7d38-466f-87ca-34c780500d3b tempest-AttachInterfacesUnderV243Test-365796744 tempest-AttachInterfacesUnderV243Test-365796744-project-member] [instance: 9bd17d9a-2f34-4e99-91b1-a7c3fbc1f29a] Start destroying the instance on the hypervisor.
{{(pid=68906) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1515.135165] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-68edea35-7d38-466f-87ca-34c780500d3b tempest-AttachInterfacesUnderV243Test-365796744 tempest-AttachInterfacesUnderV243Test-365796744-project-member] [instance: 9bd17d9a-2f34-4e99-91b1-a7c3fbc1f29a] Destroying instance {{(pid=68906) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1515.137877] env[68906]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-9898357d-6915-4389-91e2-62f6b92afadf {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1515.147194] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-10f06e47-89ec-4796-a038-afb3cac86be7 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1515.179690] env[68906]: WARNING nova.virt.vmwareapi.vmops [None req-68edea35-7d38-466f-87ca-34c780500d3b tempest-AttachInterfacesUnderV243Test-365796744 tempest-AttachInterfacesUnderV243Test-365796744-project-member] [instance: 9bd17d9a-2f34-4e99-91b1-a7c3fbc1f29a] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 9bd17d9a-2f34-4e99-91b1-a7c3fbc1f29a could not be found. [ 1515.179817] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-68edea35-7d38-466f-87ca-34c780500d3b tempest-AttachInterfacesUnderV243Test-365796744 tempest-AttachInterfacesUnderV243Test-365796744-project-member] [instance: 9bd17d9a-2f34-4e99-91b1-a7c3fbc1f29a] Instance destroyed {{(pid=68906) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1515.180013] env[68906]: INFO nova.compute.manager [None req-68edea35-7d38-466f-87ca-34c780500d3b tempest-AttachInterfacesUnderV243Test-365796744 tempest-AttachInterfacesUnderV243Test-365796744-project-member] [instance: 9bd17d9a-2f34-4e99-91b1-a7c3fbc1f29a] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1515.180263] env[68906]: DEBUG oslo.service.loopingcall [None req-68edea35-7d38-466f-87ca-34c780500d3b tempest-AttachInterfacesUnderV243Test-365796744 tempest-AttachInterfacesUnderV243Test-365796744-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return.
{{(pid=68906) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1515.182448] env[68906]: DEBUG nova.compute.manager [-] [instance: 9bd17d9a-2f34-4e99-91b1-a7c3fbc1f29a] Deallocating network for instance {{(pid=68906) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1515.182609] env[68906]: DEBUG nova.network.neutron [-] [instance: 9bd17d9a-2f34-4e99-91b1-a7c3fbc1f29a] deallocate_for_instance() {{(pid=68906) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1515.193614] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f82bfec5-7bd9-4290-89d1-cd883f3acecc {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1515.200837] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a0ada9e3-ff42-44b0-8899-9b291d912678 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1515.204155] env[68906]: DEBUG nova.network.neutron [-] [instance: 9bd17d9a-2f34-4e99-91b1-a7c3fbc1f29a] Instance cache missing network info. {{(pid=68906) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1515.233405] env[68906]: DEBUG nova.network.neutron [-] [instance: 9bd17d9a-2f34-4e99-91b1-a7c3fbc1f29a] Updating instance_info_cache with network_info: [] {{(pid=68906) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1515.234879] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-97c794e1-535a-4628-b961-1a155b22038f {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1515.242583] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-457babcf-692b-4207-a02b-300d87e6f6f2 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1515.246996] env[68906]: INFO nova.compute.manager [-] [instance: 9bd17d9a-2f34-4e99-91b1-a7c3fbc1f29a] Took 0.06 seconds to deallocate network for instance. 
[ 1515.257270] env[68906]: DEBUG nova.compute.provider_tree [None req-b9080d5d-4234-4166-bade-e67041a2b08f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Inventory has not changed in ProviderTree for provider: 1119f6db-bfd7-4ef3-bdff-5c6974dc249b {{(pid=68906) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1515.263993] env[68906]: DEBUG nova.scheduler.client.report [None req-b9080d5d-4234-4166-bade-e67041a2b08f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Inventory has not changed for provider 1119f6db-bfd7-4ef3-bdff-5c6974dc249b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 93, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68906) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1515.276499] env[68906]: DEBUG oslo_concurrency.lockutils [None req-b9080d5d-4234-4166-bade-e67041a2b08f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.346s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1515.276992] env[68906]: DEBUG nova.compute.manager [None req-b9080d5d-4234-4166-bade-e67041a2b08f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] [instance: 17327bc3-433e-4006-93c7-e53714ed70c2] Start building networks asynchronously for instance. {{(pid=68906) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1515.315856] env[68906]: DEBUG nova.compute.utils [None req-b9080d5d-4234-4166-bade-e67041a2b08f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Using /dev/sd instead of None {{(pid=68906) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1515.318759] env[68906]: DEBUG nova.compute.manager [None req-b9080d5d-4234-4166-bade-e67041a2b08f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] [instance: 17327bc3-433e-4006-93c7-e53714ed70c2] Allocating IP information in the background. {{(pid=68906) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1515.318931] env[68906]: DEBUG nova.network.neutron [None req-b9080d5d-4234-4166-bade-e67041a2b08f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] [instance: 17327bc3-433e-4006-93c7-e53714ed70c2] allocate_for_instance() {{(pid=68906) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1515.331483] env[68906]: DEBUG nova.compute.manager [None req-b9080d5d-4234-4166-bade-e67041a2b08f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] [instance: 17327bc3-433e-4006-93c7-e53714ed70c2] Start building block device mappings for instance. 
{{(pid=68906) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1515.371818] env[68906]: DEBUG oslo_concurrency.lockutils [None req-68edea35-7d38-466f-87ca-34c780500d3b tempest-AttachInterfacesUnderV243Test-365796744 tempest-AttachInterfacesUnderV243Test-365796744-project-member] Lock "9bd17d9a-2f34-4e99-91b1-a7c3fbc1f29a" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.562s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1515.372816] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Lock "9bd17d9a-2f34-4e99-91b1-a7c3fbc1f29a" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 304.420s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1515.373024] env[68906]: INFO nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: 9bd17d9a-2f34-4e99-91b1-a7c3fbc1f29a] During sync_power_state the instance has a pending task (deleting). Skip. [ 1515.373284] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Lock "9bd17d9a-2f34-4e99-91b1-a7c3fbc1f29a" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.001s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1515.386236] env[68906]: DEBUG nova.policy [None req-b9080d5d-4234-4166-bade-e67041a2b08f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'c08a6c439ba94d18b742a133848aaaae', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '0e206dedfb584e219a7f5dd633032515', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68906) authorize /opt/stack/nova/nova/policy.py:203}} [ 1515.410221] env[68906]: DEBUG nova.compute.manager [None req-b9080d5d-4234-4166-bade-e67041a2b08f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] [instance: 17327bc3-433e-4006-93c7-e53714ed70c2] Start spawning the instance on the hypervisor.
{{(pid=68906) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1515.434805] env[68906]: DEBUG nova.virt.hardware [None req-b9080d5d-4234-4166-bade-e67041a2b08f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-17T13:00:39Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-17T13:00:23Z,direct_url=<?>,disk_format='vmdk',id=b1400c31-d33b-4e13-944f-4c645e62493e,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='1ae7bf3a375d41c6af5e7536af51ffd1',properties=ImageMetaProps,protected=<?>,size=21318656,status='active',tags=<?>,updated_at=2025-04-17T13:00:24Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=68906) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1515.437165] env[68906]: DEBUG nova.virt.hardware [None req-b9080d5d-4234-4166-bade-e67041a2b08f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Flavor limits 0:0:0 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1515.437393] env[68906]: DEBUG nova.virt.hardware [None req-b9080d5d-4234-4166-bade-e67041a2b08f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Image limits 0:0:0 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1515.437599] env[68906]: DEBUG nova.virt.hardware [None req-b9080d5d-4234-4166-bade-e67041a2b08f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Flavor pref 0:0:0 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1515.437756] env[68906]: DEBUG nova.virt.hardware [None req-b9080d5d-4234-4166-bade-e67041a2b08f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Image pref 0:0:0 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1515.437909] env[68906]: DEBUG nova.virt.hardware [None req-b9080d5d-4234-4166-bade-e67041a2b08f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1515.438150] env[68906]: DEBUG nova.virt.hardware [None req-b9080d5d-4234-4166-bade-e67041a2b08f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68906) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1515.438319] env[68906]: DEBUG nova.virt.hardware [None req-b9080d5d-4234-4166-bade-e67041a2b08f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68906) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1515.438491] env[68906]: DEBUG
nova.virt.hardware [None req-b9080d5d-4234-4166-bade-e67041a2b08f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Got 1 possible topologies {{(pid=68906) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1515.438660] env[68906]: DEBUG nova.virt.hardware [None req-b9080d5d-4234-4166-bade-e67041a2b08f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68906) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1515.438841] env[68906]: DEBUG nova.virt.hardware [None req-b9080d5d-4234-4166-bade-e67041a2b08f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68906) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1515.439989] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c75bd12f-8355-48aa-99dc-52d3124e63f5 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1515.448659] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3b0204c5-394a-474a-a5e3-6c21ce23dc4e {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1515.726086] env[68906]: DEBUG nova.network.neutron [None req-b9080d5d-4234-4166-bade-e67041a2b08f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] [instance: 17327bc3-433e-4006-93c7-e53714ed70c2] Successfully created port: a0019d5a-1ca5-4191-bb96-a5a4798040c8 {{(pid=68906) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1516.276886] env[68906]: DEBUG nova.network.neutron [None req-b9080d5d-4234-4166-bade-e67041a2b08f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] [instance: 17327bc3-433e-4006-93c7-e53714ed70c2] Successfully updated port: a0019d5a-1ca5-4191-bb96-a5a4798040c8 {{(pid=68906) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1516.290210] env[68906]: DEBUG oslo_concurrency.lockutils [None req-b9080d5d-4234-4166-bade-e67041a2b08f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Acquiring lock "refresh_cache-17327bc3-433e-4006-93c7-e53714ed70c2" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1516.290336] env[68906]: DEBUG oslo_concurrency.lockutils [None req-b9080d5d-4234-4166-bade-e67041a2b08f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Acquired lock "refresh_cache-17327bc3-433e-4006-93c7-e53714ed70c2" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1516.290498] env[68906]: DEBUG nova.network.neutron [None req-b9080d5d-4234-4166-bade-e67041a2b08f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] [instance: 17327bc3-433e-4006-93c7-e53714ed70c2] Building network info cache for instance {{(pid=68906) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1516.326405] env[68906]: DEBUG nova.network.neutron [None 
req-b9080d5d-4234-4166-bade-e67041a2b08f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] [instance: 17327bc3-433e-4006-93c7-e53714ed70c2] Instance cache missing network info. {{(pid=68906) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1516.504984] env[68906]: DEBUG nova.network.neutron [None req-b9080d5d-4234-4166-bade-e67041a2b08f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] [instance: 17327bc3-433e-4006-93c7-e53714ed70c2] Updating instance_info_cache with network_info: [{"id": "a0019d5a-1ca5-4191-bb96-a5a4798040c8", "address": "fa:16:3e:ad:21:25", "network": {"id": "c9025f67-c9f7-4312-b2bd-5fbb06647b07", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-9371784-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "0e206dedfb584e219a7f5dd633032515", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "f16a5584-aed0-4df4-820b-5e7f15977265", "external-id": "cl2-zone-495", "segmentation_id": 495, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapa0019d5a-1c", "ovs_interfaceid": "a0019d5a-1ca5-4191-bb96-a5a4798040c8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68906) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1516.518974] env[68906]: DEBUG oslo_concurrency.lockutils [None req-b9080d5d-4234-4166-bade-e67041a2b08f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Releasing lock "refresh_cache-17327bc3-433e-4006-93c7-e53714ed70c2" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1516.519294] env[68906]: DEBUG nova.compute.manager [None req-b9080d5d-4234-4166-bade-e67041a2b08f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] [instance: 17327bc3-433e-4006-93c7-e53714ed70c2] Instance network_info: |[{"id": "a0019d5a-1ca5-4191-bb96-a5a4798040c8", "address": "fa:16:3e:ad:21:25", "network": {"id": "c9025f67-c9f7-4312-b2bd-5fbb06647b07", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-9371784-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "0e206dedfb584e219a7f5dd633032515", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "f16a5584-aed0-4df4-820b-5e7f15977265", "external-id": "cl2-zone-495", "segmentation_id": 495, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapa0019d5a-1c", "ovs_interfaceid": "a0019d5a-1ca5-4191-bb96-a5a4798040c8", 
"qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=68906) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1516.519723] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-b9080d5d-4234-4166-bade-e67041a2b08f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] [instance: 17327bc3-433e-4006-93c7-e53714ed70c2] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:ad:21:25', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'f16a5584-aed0-4df4-820b-5e7f15977265', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'a0019d5a-1ca5-4191-bb96-a5a4798040c8', 'vif_model': 'vmxnet3'}] {{(pid=68906) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1516.527396] env[68906]: DEBUG oslo.service.loopingcall [None req-b9080d5d-4234-4166-bade-e67041a2b08f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=68906) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1516.527859] env[68906]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 17327bc3-433e-4006-93c7-e53714ed70c2] Creating VM on the ESX host {{(pid=68906) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1516.528112] env[68906]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-892272b3-0b8d-48ea-908c-a29630bc005b {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1516.548934] env[68906]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1516.548934] env[68906]: value = "task-3475405" [ 1516.548934] env[68906]: _type = "Task" [ 1516.548934] env[68906]: } to complete. {{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1516.556580] env[68906]: DEBUG oslo_vmware.api [-] Task: {'id': task-3475405, 'name': CreateVM_Task} progress is 0%. 
{{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1516.732885] env[68906]: DEBUG nova.compute.manager [req-b6d1e791-c864-433a-ac56-2589d7b7b5ce req-b3d4bf60-59eb-4adc-93c8-f2ac25448f43 service nova] [instance: 17327bc3-433e-4006-93c7-e53714ed70c2] Received event network-vif-plugged-a0019d5a-1ca5-4191-bb96-a5a4798040c8 {{(pid=68906) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1516.733122] env[68906]: DEBUG oslo_concurrency.lockutils [req-b6d1e791-c864-433a-ac56-2589d7b7b5ce req-b3d4bf60-59eb-4adc-93c8-f2ac25448f43 service nova] Acquiring lock "17327bc3-433e-4006-93c7-e53714ed70c2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1516.733312] env[68906]: DEBUG oslo_concurrency.lockutils [req-b6d1e791-c864-433a-ac56-2589d7b7b5ce req-b3d4bf60-59eb-4adc-93c8-f2ac25448f43 service nova] Lock "17327bc3-433e-4006-93c7-e53714ed70c2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1516.733480] env[68906]: DEBUG oslo_concurrency.lockutils [req-b6d1e791-c864-433a-ac56-2589d7b7b5ce req-b3d4bf60-59eb-4adc-93c8-f2ac25448f43 service nova] Lock "17327bc3-433e-4006-93c7-e53714ed70c2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1516.733690] env[68906]: DEBUG nova.compute.manager [req-b6d1e791-c864-433a-ac56-2589d7b7b5ce req-b3d4bf60-59eb-4adc-93c8-f2ac25448f43 service nova] [instance: 17327bc3-433e-4006-93c7-e53714ed70c2] No waiting events found dispatching network-vif-plugged-a0019d5a-1ca5-4191-bb96-a5a4798040c8 {{(pid=68906) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1516.733853] env[68906]: WARNING nova.compute.manager [req-b6d1e791-c864-433a-ac56-2589d7b7b5ce req-b3d4bf60-59eb-4adc-93c8-f2ac25448f43 service nova] [instance: 17327bc3-433e-4006-93c7-e53714ed70c2] Received unexpected event network-vif-plugged-a0019d5a-1ca5-4191-bb96-a5a4798040c8 for instance with vm_state building and task_state spawning. [ 1516.734016] env[68906]: DEBUG nova.compute.manager [req-b6d1e791-c864-433a-ac56-2589d7b7b5ce req-b3d4bf60-59eb-4adc-93c8-f2ac25448f43 service nova] [instance: 17327bc3-433e-4006-93c7-e53714ed70c2] Received event network-changed-a0019d5a-1ca5-4191-bb96-a5a4798040c8 {{(pid=68906) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1516.734178] env[68906]: DEBUG nova.compute.manager [req-b6d1e791-c864-433a-ac56-2589d7b7b5ce req-b3d4bf60-59eb-4adc-93c8-f2ac25448f43 service nova] [instance: 17327bc3-433e-4006-93c7-e53714ed70c2] Refreshing instance network info cache due to event network-changed-a0019d5a-1ca5-4191-bb96-a5a4798040c8.
{{(pid=68906) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1516.734356] env[68906]: DEBUG oslo_concurrency.lockutils [req-b6d1e791-c864-433a-ac56-2589d7b7b5ce req-b3d4bf60-59eb-4adc-93c8-f2ac25448f43 service nova] Acquiring lock "refresh_cache-17327bc3-433e-4006-93c7-e53714ed70c2" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1516.734491] env[68906]: DEBUG oslo_concurrency.lockutils [req-b6d1e791-c864-433a-ac56-2589d7b7b5ce req-b3d4bf60-59eb-4adc-93c8-f2ac25448f43 service nova] Acquired lock "refresh_cache-17327bc3-433e-4006-93c7-e53714ed70c2" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1516.734645] env[68906]: DEBUG nova.network.neutron [req-b6d1e791-c864-433a-ac56-2589d7b7b5ce req-b3d4bf60-59eb-4adc-93c8-f2ac25448f43 service nova] [instance: 17327bc3-433e-4006-93c7-e53714ed70c2] Refreshing network info cache for port a0019d5a-1ca5-4191-bb96-a5a4798040c8 {{(pid=68906) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1516.982522] env[68906]: DEBUG nova.network.neutron [req-b6d1e791-c864-433a-ac56-2589d7b7b5ce req-b3d4bf60-59eb-4adc-93c8-f2ac25448f43 service nova] [instance: 17327bc3-433e-4006-93c7-e53714ed70c2] Updated VIF entry in instance network info cache for port a0019d5a-1ca5-4191-bb96-a5a4798040c8. {{(pid=68906) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1516.983082] env[68906]: DEBUG nova.network.neutron [req-b6d1e791-c864-433a-ac56-2589d7b7b5ce req-b3d4bf60-59eb-4adc-93c8-f2ac25448f43 service nova] [instance: 17327bc3-433e-4006-93c7-e53714ed70c2] Updating instance_info_cache with network_info: [{"id": "a0019d5a-1ca5-4191-bb96-a5a4798040c8", "address": "fa:16:3e:ad:21:25", "network": {"id": "c9025f67-c9f7-4312-b2bd-5fbb06647b07", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-9371784-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "0e206dedfb584e219a7f5dd633032515", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "f16a5584-aed0-4df4-820b-5e7f15977265", "external-id": "cl2-zone-495", "segmentation_id": 495, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapa0019d5a-1c", "ovs_interfaceid": "a0019d5a-1ca5-4191-bb96-a5a4798040c8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68906) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1516.994080] env[68906]: DEBUG oslo_concurrency.lockutils [req-b6d1e791-c864-433a-ac56-2589d7b7b5ce req-b3d4bf60-59eb-4adc-93c8-f2ac25448f43 service nova] Releasing lock "refresh_cache-17327bc3-433e-4006-93c7-e53714ed70c2" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1517.059431] env[68906]: DEBUG oslo_vmware.api [-] Task: {'id': task-3475405, 'name': CreateVM_Task, 'duration_secs': 0.38993} completed successfully. 
{{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1517.060918] env[68906]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 17327bc3-433e-4006-93c7-e53714ed70c2] Created VM on the ESX host {{(pid=68906) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1517.060918] env[68906]: DEBUG oslo_concurrency.lockutils [None req-b9080d5d-4234-4166-bade-e67041a2b08f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1517.060918] env[68906]: DEBUG oslo_concurrency.lockutils [None req-b9080d5d-4234-4166-bade-e67041a2b08f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Acquired lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1517.060918] env[68906]: DEBUG oslo_concurrency.lockutils [None req-b9080d5d-4234-4166-bade-e67041a2b08f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1517.061134] env[68906]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-42be8e8f-ec1a-40f4-a2a8-020107b0633a {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1517.065494] env[68906]: DEBUG oslo_vmware.api [None req-b9080d5d-4234-4166-bade-e67041a2b08f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Waiting for the task: (returnval){ [ 1517.065494] env[68906]: value = "session[52a3cb0d-1212-490c-2669-91043b4da4d8]52fa4bf0-13ee-5aa2-5b70-d7854d6f6218" [ 1517.065494] env[68906]: _type = "Task" [ 1517.065494] env[68906]: } to complete. {{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1517.072317] env[68906]: DEBUG oslo_vmware.api [None req-b9080d5d-4234-4166-bade-e67041a2b08f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Task: {'id': session[52a3cb0d-1212-490c-2669-91043b4da4d8]52fa4bf0-13ee-5aa2-5b70-d7854d6f6218, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1517.574841] env[68906]: DEBUG oslo_concurrency.lockutils [None req-b9080d5d-4234-4166-bade-e67041a2b08f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Releasing lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1517.575163] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-b9080d5d-4234-4166-bade-e67041a2b08f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] [instance: 17327bc3-433e-4006-93c7-e53714ed70c2] Processing image b1400c31-d33b-4e13-944f-4c645e62493e {{(pid=68906) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1517.575369] env[68906]: DEBUG oslo_concurrency.lockutils [None req-b9080d5d-4234-4166-bade-e67041a2b08f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e/b1400c31-d33b-4e13-944f-4c645e62493e.vmdk" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1543.731453] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1543.754260] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1546.140454] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1546.140839] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Starting heal instance info cache {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 1546.140839] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Rebuilding the list of instances to heal {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 1546.161887] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: 4d36bb91-0cde-44cb-8706-d17740a9cf50] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1546.162075] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: db011373-7285-4882-8bce-d39cfa22fe80] Skipping network cache update for instance because it is Building. 
{{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1546.162190] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: 1fdb401a-ac25-4418-803c-fc0b2297f2d4] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1546.162310] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: 6c28f571-e74a-48f9-9cc7-a9e4ddea8b09] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1546.162437] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: e0e595e3-e47e-4cf1-8977-f004eca942d1] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1546.162561] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: 7466df8a-59a9-49b9-bff7-c4efbeae3eee] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1546.162680] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: 89171680-c76d-4826-9236-379542661ffb] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1546.162798] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: 9b884416-df89-4d8c-b2ab-0667db52a718] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1546.162916] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: aed06616-d008-4695-b66e-9f40acf5ebd3] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1546.163044] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: 17327bc3-433e-4006-93c7-e53714ed70c2] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1546.163167] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Didn't find any instances for network info cache update. 
{{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 1546.163668] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1547.140101] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1549.140185] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1549.140502] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1549.140554] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=68906) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 1551.141858] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1552.135707] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1552.140552] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager.update_available_resource {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1552.153963] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1552.154235] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1552.154386] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1552.154525] env[68906]: DEBUG nova.compute.resource_tracker [None 
req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=68906) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1552.155633] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-582f507a-f310-471a-a716-222a45a9fb04 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1552.164221] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e2bec6bd-d4e7-4480-a900-8ac41e2aca2f {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1552.178658] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-044c3a1b-216c-46a1-8512-a06f9e4da06d {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1552.185131] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-26e0980f-b36b-4ebe-9220-5bc99236c5e5 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1552.214243] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180941MB free_disk=93GB free_vcpus=48 pci_devices=None {{(pid=68906) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1552.214449] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1552.214623] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1552.287616] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 4d36bb91-0cde-44cb-8706-d17740a9cf50 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1552.287786] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance db011373-7285-4882-8bce-d39cfa22fe80 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1552.287919] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 1fdb401a-ac25-4418-803c-fc0b2297f2d4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1552.288062] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 6c28f571-e74a-48f9-9cc7-a9e4ddea8b09 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1552.288188] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance e0e595e3-e47e-4cf1-8977-f004eca942d1 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1552.288308] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 7466df8a-59a9-49b9-bff7-c4efbeae3eee actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1552.288423] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 89171680-c76d-4826-9236-379542661ffb actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1552.288539] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 9b884416-df89-4d8c-b2ab-0667db52a718 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1552.288654] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance aed06616-d008-4695-b66e-9f40acf5ebd3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1552.288766] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 17327bc3-433e-4006-93c7-e53714ed70c2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1552.299682] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 32f5b54d-30bf-4fe9-9622-3ff74344b3f3 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1552.310162] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 922d81ba-c8d2-43ba-b1c5-f2943418d6a2 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1552.319770] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 302e2275-a3ec-48c5-899e-6f385190bfe8 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1552.329325] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance a59ab448-c4f1-4f54-be7a-7e204130f3f8 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1552.338380] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 736db39c-e5e5-4a54-b85a-aa5c703f432e has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1552.347316] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 7faa4c32-7572-4594-a760-e928607bf2b6 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1552.347533] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=68906) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1552.347679] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=68906) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1552.519112] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c6907deb-ccd0-4516-8bb5-a854ab62be66 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1552.526415] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bfc3577d-3e06-48c7-b614-0234a45fbbe9 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1552.555333] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-25d094a1-25b0-45f3-919a-4f5f7a56ee41 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1552.562941] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-570d57ed-71a8-421a-a4c0-1c51ede90f3b {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1552.576334] env[68906]: DEBUG nova.compute.provider_tree [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Inventory has not changed in ProviderTree for provider: 1119f6db-bfd7-4ef3-bdff-5c6974dc249b {{(pid=68906) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1552.584666] env[68906]: DEBUG nova.scheduler.client.report [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Inventory has not changed for provider 1119f6db-bfd7-4ef3-bdff-5c6974dc249b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 93, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68906) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1552.599335] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=68906) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1552.599518] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.385s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1560.791221] env[68906]: WARNING oslo_vmware.rw_handles [None 
req-bf384869-204f-4e55-8c64-1babf5601dfe tempest-ServerMetadataTestJSON-461341781 tempest-ServerMetadataTestJSON-461341781-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1560.791221] env[68906]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1560.791221] env[68906]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1560.791221] env[68906]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1560.791221] env[68906]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1560.791221] env[68906]: ERROR oslo_vmware.rw_handles response.begin() [ 1560.791221] env[68906]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1560.791221] env[68906]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1560.791221] env[68906]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1560.791221] env[68906]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1560.791221] env[68906]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1560.791221] env[68906]: ERROR oslo_vmware.rw_handles [ 1560.791221] env[68906]: DEBUG nova.virt.vmwareapi.images [None req-bf384869-204f-4e55-8c64-1babf5601dfe tempest-ServerMetadataTestJSON-461341781 tempest-ServerMetadataTestJSON-461341781-project-member] [instance: 4d36bb91-0cde-44cb-8706-d17740a9cf50] Downloaded image file data b1400c31-d33b-4e13-944f-4c645e62493e to vmware_temp/8357a756-252d-4158-a384-de001ddee4e4/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk on the data store datastore2 {{(pid=68906) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1560.793245] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-bf384869-204f-4e55-8c64-1babf5601dfe tempest-ServerMetadataTestJSON-461341781 tempest-ServerMetadataTestJSON-461341781-project-member] [instance: 4d36bb91-0cde-44cb-8706-d17740a9cf50] Caching image {{(pid=68906) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1560.793521] env[68906]: DEBUG nova.virt.vmwareapi.vm_util [None req-bf384869-204f-4e55-8c64-1babf5601dfe tempest-ServerMetadataTestJSON-461341781 tempest-ServerMetadataTestJSON-461341781-project-member] Copying Virtual Disk [datastore2] vmware_temp/8357a756-252d-4158-a384-de001ddee4e4/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk to [datastore2] vmware_temp/8357a756-252d-4158-a384-de001ddee4e4/b1400c31-d33b-4e13-944f-4c645e62493e/b1400c31-d33b-4e13-944f-4c645e62493e.vmdk {{(pid=68906) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1560.793847] env[68906]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-c56ddb7b-8877-49c5-91cc-58dd12fd180f {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1560.802138] env[68906]: DEBUG oslo_vmware.api [None req-bf384869-204f-4e55-8c64-1babf5601dfe tempest-ServerMetadataTestJSON-461341781 tempest-ServerMetadataTestJSON-461341781-project-member] Waiting for the task: (returnval){ [ 1560.802138] env[68906]: value = "task-3475406" [ 1560.802138] env[68906]: _type = "Task" [ 1560.802138] env[68906]: } to complete. 
{{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1560.810075] env[68906]: DEBUG oslo_vmware.api [None req-bf384869-204f-4e55-8c64-1babf5601dfe tempest-ServerMetadataTestJSON-461341781 tempest-ServerMetadataTestJSON-461341781-project-member] Task: {'id': task-3475406, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1561.313188] env[68906]: DEBUG oslo_vmware.exceptions [None req-bf384869-204f-4e55-8c64-1babf5601dfe tempest-ServerMetadataTestJSON-461341781 tempest-ServerMetadataTestJSON-461341781-project-member] Fault InvalidArgument not matched. {{(pid=68906) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1561.313494] env[68906]: DEBUG oslo_concurrency.lockutils [None req-bf384869-204f-4e55-8c64-1babf5601dfe tempest-ServerMetadataTestJSON-461341781 tempest-ServerMetadataTestJSON-461341781-project-member] Releasing lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e/b1400c31-d33b-4e13-944f-4c645e62493e.vmdk" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1561.314097] env[68906]: ERROR nova.compute.manager [None req-bf384869-204f-4e55-8c64-1babf5601dfe tempest-ServerMetadataTestJSON-461341781 tempest-ServerMetadataTestJSON-461341781-project-member] [instance: 4d36bb91-0cde-44cb-8706-d17740a9cf50] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1561.314097] env[68906]: Faults: ['InvalidArgument'] [ 1561.314097] env[68906]: ERROR nova.compute.manager [instance: 4d36bb91-0cde-44cb-8706-d17740a9cf50] Traceback (most recent call last): [ 1561.314097] env[68906]: ERROR nova.compute.manager [instance: 4d36bb91-0cde-44cb-8706-d17740a9cf50] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1561.314097] env[68906]: ERROR nova.compute.manager [instance: 4d36bb91-0cde-44cb-8706-d17740a9cf50] yield resources [ 1561.314097] env[68906]: ERROR nova.compute.manager [instance: 4d36bb91-0cde-44cb-8706-d17740a9cf50] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1561.314097] env[68906]: ERROR nova.compute.manager [instance: 4d36bb91-0cde-44cb-8706-d17740a9cf50] self.driver.spawn(context, instance, image_meta, [ 1561.314097] env[68906]: ERROR nova.compute.manager [instance: 4d36bb91-0cde-44cb-8706-d17740a9cf50] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1561.314097] env[68906]: ERROR nova.compute.manager [instance: 4d36bb91-0cde-44cb-8706-d17740a9cf50] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1561.314097] env[68906]: ERROR nova.compute.manager [instance: 4d36bb91-0cde-44cb-8706-d17740a9cf50] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1561.314097] env[68906]: ERROR nova.compute.manager [instance: 4d36bb91-0cde-44cb-8706-d17740a9cf50] self._fetch_image_if_missing(context, vi) [ 1561.314097] env[68906]: ERROR nova.compute.manager [instance: 4d36bb91-0cde-44cb-8706-d17740a9cf50] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1561.314662] env[68906]: ERROR nova.compute.manager [instance: 4d36bb91-0cde-44cb-8706-d17740a9cf50] image_cache(vi, tmp_image_ds_loc) [ 1561.314662] env[68906]: ERROR nova.compute.manager [instance: 
4d36bb91-0cde-44cb-8706-d17740a9cf50] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1561.314662] env[68906]: ERROR nova.compute.manager [instance: 4d36bb91-0cde-44cb-8706-d17740a9cf50] vm_util.copy_virtual_disk( [ 1561.314662] env[68906]: ERROR nova.compute.manager [instance: 4d36bb91-0cde-44cb-8706-d17740a9cf50] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1561.314662] env[68906]: ERROR nova.compute.manager [instance: 4d36bb91-0cde-44cb-8706-d17740a9cf50] session._wait_for_task(vmdk_copy_task) [ 1561.314662] env[68906]: ERROR nova.compute.manager [instance: 4d36bb91-0cde-44cb-8706-d17740a9cf50] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1561.314662] env[68906]: ERROR nova.compute.manager [instance: 4d36bb91-0cde-44cb-8706-d17740a9cf50] return self.wait_for_task(task_ref) [ 1561.314662] env[68906]: ERROR nova.compute.manager [instance: 4d36bb91-0cde-44cb-8706-d17740a9cf50] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1561.314662] env[68906]: ERROR nova.compute.manager [instance: 4d36bb91-0cde-44cb-8706-d17740a9cf50] return evt.wait() [ 1561.314662] env[68906]: ERROR nova.compute.manager [instance: 4d36bb91-0cde-44cb-8706-d17740a9cf50] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1561.314662] env[68906]: ERROR nova.compute.manager [instance: 4d36bb91-0cde-44cb-8706-d17740a9cf50] result = hub.switch() [ 1561.314662] env[68906]: ERROR nova.compute.manager [instance: 4d36bb91-0cde-44cb-8706-d17740a9cf50] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1561.314662] env[68906]: ERROR nova.compute.manager [instance: 4d36bb91-0cde-44cb-8706-d17740a9cf50] return self.greenlet.switch() [ 1561.315217] env[68906]: ERROR nova.compute.manager [instance: 4d36bb91-0cde-44cb-8706-d17740a9cf50] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1561.315217] env[68906]: ERROR nova.compute.manager [instance: 4d36bb91-0cde-44cb-8706-d17740a9cf50] self.f(*self.args, **self.kw) [ 1561.315217] env[68906]: ERROR nova.compute.manager [instance: 4d36bb91-0cde-44cb-8706-d17740a9cf50] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1561.315217] env[68906]: ERROR nova.compute.manager [instance: 4d36bb91-0cde-44cb-8706-d17740a9cf50] raise exceptions.translate_fault(task_info.error) [ 1561.315217] env[68906]: ERROR nova.compute.manager [instance: 4d36bb91-0cde-44cb-8706-d17740a9cf50] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1561.315217] env[68906]: ERROR nova.compute.manager [instance: 4d36bb91-0cde-44cb-8706-d17740a9cf50] Faults: ['InvalidArgument'] [ 1561.315217] env[68906]: ERROR nova.compute.manager [instance: 4d36bb91-0cde-44cb-8706-d17740a9cf50] [ 1561.315217] env[68906]: INFO nova.compute.manager [None req-bf384869-204f-4e55-8c64-1babf5601dfe tempest-ServerMetadataTestJSON-461341781 tempest-ServerMetadataTestJSON-461341781-project-member] [instance: 4d36bb91-0cde-44cb-8706-d17740a9cf50] Terminating instance [ 1561.316339] env[68906]: DEBUG oslo_concurrency.lockutils [None req-4d72aadc-5a5f-4626-b220-381c1afbdf3f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Acquired lock "[datastore2] 
devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e/b1400c31-d33b-4e13-944f-4c645e62493e.vmdk" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1561.316339] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-4d72aadc-5a5f-4626-b220-381c1afbdf3f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=68906) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1561.317116] env[68906]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-3c55ee60-6d83-49ca-a4ae-47a4668c3618 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1561.320217] env[68906]: DEBUG nova.compute.manager [None req-bf384869-204f-4e55-8c64-1babf5601dfe tempest-ServerMetadataTestJSON-461341781 tempest-ServerMetadataTestJSON-461341781-project-member] [instance: 4d36bb91-0cde-44cb-8706-d17740a9cf50] Start destroying the instance on the hypervisor. {{(pid=68906) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1561.320678] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-bf384869-204f-4e55-8c64-1babf5601dfe tempest-ServerMetadataTestJSON-461341781 tempest-ServerMetadataTestJSON-461341781-project-member] [instance: 4d36bb91-0cde-44cb-8706-d17740a9cf50] Destroying instance {{(pid=68906) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1561.321401] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-04c4458a-0e8c-424e-a4f3-ef33b8f5f137 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1561.328176] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-bf384869-204f-4e55-8c64-1babf5601dfe tempest-ServerMetadataTestJSON-461341781 tempest-ServerMetadataTestJSON-461341781-project-member] [instance: 4d36bb91-0cde-44cb-8706-d17740a9cf50] Unregistering the VM {{(pid=68906) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1561.328409] env[68906]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-edc2b34b-da0f-4656-a05b-8a08ec53ae43 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1561.330664] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-4d72aadc-5a5f-4626-b220-381c1afbdf3f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=68906) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1561.330839] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-4d72aadc-5a5f-4626-b220-381c1afbdf3f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Folder [datastore2] devstack-image-cache_base created. 
{{(pid=68906) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1561.331822] env[68906]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-a45c9076-3be8-4b66-a509-ce617596fbc4 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1561.336724] env[68906]: DEBUG oslo_vmware.api [None req-4d72aadc-5a5f-4626-b220-381c1afbdf3f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Waiting for the task: (returnval){ [ 1561.336724] env[68906]: value = "session[52a3cb0d-1212-490c-2669-91043b4da4d8]52c65f25-4c65-8687-e96c-39e0865c67f3" [ 1561.336724] env[68906]: _type = "Task" [ 1561.336724] env[68906]: } to complete. {{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1561.343635] env[68906]: DEBUG oslo_vmware.api [None req-4d72aadc-5a5f-4626-b220-381c1afbdf3f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Task: {'id': session[52a3cb0d-1212-490c-2669-91043b4da4d8]52c65f25-4c65-8687-e96c-39e0865c67f3, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1561.398452] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-bf384869-204f-4e55-8c64-1babf5601dfe tempest-ServerMetadataTestJSON-461341781 tempest-ServerMetadataTestJSON-461341781-project-member] [instance: 4d36bb91-0cde-44cb-8706-d17740a9cf50] Unregistered the VM {{(pid=68906) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1561.398677] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-bf384869-204f-4e55-8c64-1babf5601dfe tempest-ServerMetadataTestJSON-461341781 tempest-ServerMetadataTestJSON-461341781-project-member] [instance: 4d36bb91-0cde-44cb-8706-d17740a9cf50] Deleting contents of the VM from datastore datastore2 {{(pid=68906) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1561.398869] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-bf384869-204f-4e55-8c64-1babf5601dfe tempest-ServerMetadataTestJSON-461341781 tempest-ServerMetadataTestJSON-461341781-project-member] Deleting the datastore file [datastore2] 4d36bb91-0cde-44cb-8706-d17740a9cf50 {{(pid=68906) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1561.399188] env[68906]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-554a4907-13d4-400f-a164-6121141add8d {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1561.405032] env[68906]: DEBUG oslo_vmware.api [None req-bf384869-204f-4e55-8c64-1babf5601dfe tempest-ServerMetadataTestJSON-461341781 tempest-ServerMetadataTestJSON-461341781-project-member] Waiting for the task: (returnval){ [ 1561.405032] env[68906]: value = "task-3475408" [ 1561.405032] env[68906]: _type = "Task" [ 1561.405032] env[68906]: } to complete. {{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1561.412703] env[68906]: DEBUG oslo_vmware.api [None req-bf384869-204f-4e55-8c64-1babf5601dfe tempest-ServerMetadataTestJSON-461341781 tempest-ServerMetadataTestJSON-461341781-project-member] Task: {'id': task-3475408, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1561.847595] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-4d72aadc-5a5f-4626-b220-381c1afbdf3f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] [instance: db011373-7285-4882-8bce-d39cfa22fe80] Preparing fetch location {{(pid=68906) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1561.847897] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-4d72aadc-5a5f-4626-b220-381c1afbdf3f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Creating directory with path [datastore2] vmware_temp/e01e2753-288b-4e2b-9222-4184c7edcbf8/b1400c31-d33b-4e13-944f-4c645e62493e {{(pid=68906) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1561.848099] env[68906]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-c50c7233-29c0-4034-ae9a-2b5289cedd9f {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1561.859592] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-4d72aadc-5a5f-4626-b220-381c1afbdf3f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Created directory with path [datastore2] vmware_temp/e01e2753-288b-4e2b-9222-4184c7edcbf8/b1400c31-d33b-4e13-944f-4c645e62493e {{(pid=68906) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1561.859789] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-4d72aadc-5a5f-4626-b220-381c1afbdf3f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] [instance: db011373-7285-4882-8bce-d39cfa22fe80] Fetch image to [datastore2] vmware_temp/e01e2753-288b-4e2b-9222-4184c7edcbf8/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk {{(pid=68906) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1561.859958] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-4d72aadc-5a5f-4626-b220-381c1afbdf3f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] [instance: db011373-7285-4882-8bce-d39cfa22fe80] Downloading image file data b1400c31-d33b-4e13-944f-4c645e62493e to [datastore2] vmware_temp/e01e2753-288b-4e2b-9222-4184c7edcbf8/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk on the data store datastore2 {{(pid=68906) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1561.860730] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-685df8f9-9605-47c8-a84e-3b4d81daaf54 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1561.867295] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d01a650e-9b91-4db3-8d84-609e0f7677ce {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1561.876089] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ca795cf4-4a24-48ff-922c-2f4eef2a88cb {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1561.908368] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-9a74d6c2-1794-4b07-ae71-15af7fd0c392 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1561.915053] env[68906]: DEBUG oslo_vmware.api [None req-bf384869-204f-4e55-8c64-1babf5601dfe tempest-ServerMetadataTestJSON-461341781 tempest-ServerMetadataTestJSON-461341781-project-member] Task: {'id': task-3475408, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.083902} completed successfully. {{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1561.916460] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-bf384869-204f-4e55-8c64-1babf5601dfe tempest-ServerMetadataTestJSON-461341781 tempest-ServerMetadataTestJSON-461341781-project-member] Deleted the datastore file {{(pid=68906) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1561.916662] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-bf384869-204f-4e55-8c64-1babf5601dfe tempest-ServerMetadataTestJSON-461341781 tempest-ServerMetadataTestJSON-461341781-project-member] [instance: 4d36bb91-0cde-44cb-8706-d17740a9cf50] Deleted contents of the VM from datastore datastore2 {{(pid=68906) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1561.916830] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-bf384869-204f-4e55-8c64-1babf5601dfe tempest-ServerMetadataTestJSON-461341781 tempest-ServerMetadataTestJSON-461341781-project-member] [instance: 4d36bb91-0cde-44cb-8706-d17740a9cf50] Instance destroyed {{(pid=68906) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1561.917013] env[68906]: INFO nova.compute.manager [None req-bf384869-204f-4e55-8c64-1babf5601dfe tempest-ServerMetadataTestJSON-461341781 tempest-ServerMetadataTestJSON-461341781-project-member] [instance: 4d36bb91-0cde-44cb-8706-d17740a9cf50] Took 0.60 seconds to destroy the instance on the hypervisor. 
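Annotation: the CopyVirtualDisk_Task failure recorded above does not come back from the SOAP call itself; oslo.vmware polls the task object and re-raises the vim fault ("A specified parameter was not correct: fileType", Faults: ['InvalidArgument']) as a VimFaultException, which is what the _cache_sparse_image traceback shows. A minimal sketch of that flow, assuming a reachable vCenter; the host, credentials, and datastore paths below are placeholders, not values from this run:

    from oslo_vmware import api
    from oslo_vmware import exceptions as vexc

    # Placeholder vCenter and credentials; datastore paths are illustrative only.
    session = api.VMwareAPISession(
        'vc.example.test', 'user', 'secret',  # host, username, password
        10,    # api_retry_count
        0.5)   # task_poll_interval, seconds between task.info polls

    disk_mgr = session.vim.service_content.virtualDiskManager
    task = session.invoke_api(
        session.vim, 'CopyVirtualDisk_Task', disk_mgr,
        sourceName='[datastore2] vmware_temp/example/tmp-sparse.vmdk',
        destName='[datastore2] vmware_temp/example/example.vmdk')
    try:
        # Polls the task in a looping call; an error state is translated into
        # an exception (here VimFaultException, Faults: ['InvalidArgument']).
        session.wait_for_task(task)
    except vexc.VimFaultException as e:
        print(e.fault_list, str(e))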
[ 1561.918726] env[68906]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-abe6f44b-96d9-437e-8fbb-2e141e1befc4 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1561.920548] env[68906]: DEBUG nova.compute.claims [None req-bf384869-204f-4e55-8c64-1babf5601dfe tempest-ServerMetadataTestJSON-461341781 tempest-ServerMetadataTestJSON-461341781-project-member] [instance: 4d36bb91-0cde-44cb-8706-d17740a9cf50] Aborting claim: {{(pid=68906) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1561.920718] env[68906]: DEBUG oslo_concurrency.lockutils [None req-bf384869-204f-4e55-8c64-1babf5601dfe tempest-ServerMetadataTestJSON-461341781 tempest-ServerMetadataTestJSON-461341781-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1561.920932] env[68906]: DEBUG oslo_concurrency.lockutils [None req-bf384869-204f-4e55-8c64-1babf5601dfe tempest-ServerMetadataTestJSON-461341781 tempest-ServerMetadataTestJSON-461341781-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1561.946536] env[68906]: DEBUG nova.virt.vmwareapi.images [None req-4d72aadc-5a5f-4626-b220-381c1afbdf3f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] [instance: db011373-7285-4882-8bce-d39cfa22fe80] Downloading image file data b1400c31-d33b-4e13-944f-4c645e62493e to the data store datastore2 {{(pid=68906) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1561.996387] env[68906]: DEBUG oslo_vmware.rw_handles [None req-4d72aadc-5a5f-4626-b220-381c1afbdf3f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/e01e2753-288b-4e2b-9222-4184c7edcbf8/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=68906) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1562.055464] env[68906]: DEBUG oslo_vmware.rw_handles [None req-4d72aadc-5a5f-4626-b220-381c1afbdf3f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Completed reading data from the image iterator. {{(pid=68906) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1562.055657] env[68906]: DEBUG oslo_vmware.rw_handles [None req-4d72aadc-5a5f-4626-b220-381c1afbdf3f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Closing write handle for https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/e01e2753-288b-4e2b-9222-4184c7edcbf8/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=68906) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1562.195343] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6a4feab2-84e1-44d5-8dea-27d07ddb26dd {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1562.203073] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bd92102c-d091-4561-bc89-c7682ddc2530 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1562.232758] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-938174a5-ddb8-4d5c-af32-b82519dfedc6 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1562.239617] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d683d8d9-2d33-4bc7-af86-1a614e721baf {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1562.252531] env[68906]: DEBUG nova.compute.provider_tree [None req-bf384869-204f-4e55-8c64-1babf5601dfe tempest-ServerMetadataTestJSON-461341781 tempest-ServerMetadataTestJSON-461341781-project-member] Inventory has not changed in ProviderTree for provider: 1119f6db-bfd7-4ef3-bdff-5c6974dc249b {{(pid=68906) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1562.261095] env[68906]: DEBUG nova.scheduler.client.report [None req-bf384869-204f-4e55-8c64-1babf5601dfe tempest-ServerMetadataTestJSON-461341781 tempest-ServerMetadataTestJSON-461341781-project-member] Inventory has not changed for provider 1119f6db-bfd7-4ef3-bdff-5c6974dc249b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 93, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68906) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1562.274357] env[68906]: DEBUG oslo_concurrency.lockutils [None req-bf384869-204f-4e55-8c64-1babf5601dfe tempest-ServerMetadataTestJSON-461341781 tempest-ServerMetadataTestJSON-461341781-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.353s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1562.274901] env[68906]: ERROR nova.compute.manager [None req-bf384869-204f-4e55-8c64-1babf5601dfe tempest-ServerMetadataTestJSON-461341781 tempest-ServerMetadataTestJSON-461341781-project-member] [instance: 4d36bb91-0cde-44cb-8706-d17740a9cf50] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1562.274901] env[68906]: Faults: ['InvalidArgument'] [ 1562.274901] env[68906]: ERROR nova.compute.manager [instance: 4d36bb91-0cde-44cb-8706-d17740a9cf50] Traceback (most recent call last): [ 1562.274901] env[68906]: ERROR nova.compute.manager [instance: 4d36bb91-0cde-44cb-8706-d17740a9cf50] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1562.274901] env[68906]: ERROR 
nova.compute.manager [instance: 4d36bb91-0cde-44cb-8706-d17740a9cf50] self.driver.spawn(context, instance, image_meta, [ 1562.274901] env[68906]: ERROR nova.compute.manager [instance: 4d36bb91-0cde-44cb-8706-d17740a9cf50] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1562.274901] env[68906]: ERROR nova.compute.manager [instance: 4d36bb91-0cde-44cb-8706-d17740a9cf50] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1562.274901] env[68906]: ERROR nova.compute.manager [instance: 4d36bb91-0cde-44cb-8706-d17740a9cf50] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1562.274901] env[68906]: ERROR nova.compute.manager [instance: 4d36bb91-0cde-44cb-8706-d17740a9cf50] self._fetch_image_if_missing(context, vi) [ 1562.274901] env[68906]: ERROR nova.compute.manager [instance: 4d36bb91-0cde-44cb-8706-d17740a9cf50] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1562.274901] env[68906]: ERROR nova.compute.manager [instance: 4d36bb91-0cde-44cb-8706-d17740a9cf50] image_cache(vi, tmp_image_ds_loc) [ 1562.274901] env[68906]: ERROR nova.compute.manager [instance: 4d36bb91-0cde-44cb-8706-d17740a9cf50] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1562.275345] env[68906]: ERROR nova.compute.manager [instance: 4d36bb91-0cde-44cb-8706-d17740a9cf50] vm_util.copy_virtual_disk( [ 1562.275345] env[68906]: ERROR nova.compute.manager [instance: 4d36bb91-0cde-44cb-8706-d17740a9cf50] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1562.275345] env[68906]: ERROR nova.compute.manager [instance: 4d36bb91-0cde-44cb-8706-d17740a9cf50] session._wait_for_task(vmdk_copy_task) [ 1562.275345] env[68906]: ERROR nova.compute.manager [instance: 4d36bb91-0cde-44cb-8706-d17740a9cf50] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1562.275345] env[68906]: ERROR nova.compute.manager [instance: 4d36bb91-0cde-44cb-8706-d17740a9cf50] return self.wait_for_task(task_ref) [ 1562.275345] env[68906]: ERROR nova.compute.manager [instance: 4d36bb91-0cde-44cb-8706-d17740a9cf50] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1562.275345] env[68906]: ERROR nova.compute.manager [instance: 4d36bb91-0cde-44cb-8706-d17740a9cf50] return evt.wait() [ 1562.275345] env[68906]: ERROR nova.compute.manager [instance: 4d36bb91-0cde-44cb-8706-d17740a9cf50] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1562.275345] env[68906]: ERROR nova.compute.manager [instance: 4d36bb91-0cde-44cb-8706-d17740a9cf50] result = hub.switch() [ 1562.275345] env[68906]: ERROR nova.compute.manager [instance: 4d36bb91-0cde-44cb-8706-d17740a9cf50] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1562.275345] env[68906]: ERROR nova.compute.manager [instance: 4d36bb91-0cde-44cb-8706-d17740a9cf50] return self.greenlet.switch() [ 1562.275345] env[68906]: ERROR nova.compute.manager [instance: 4d36bb91-0cde-44cb-8706-d17740a9cf50] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1562.275345] env[68906]: ERROR nova.compute.manager [instance: 4d36bb91-0cde-44cb-8706-d17740a9cf50] self.f(*self.args, **self.kw) [ 1562.275751] env[68906]: ERROR nova.compute.manager [instance: 4d36bb91-0cde-44cb-8706-d17740a9cf50] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1562.275751] env[68906]: ERROR nova.compute.manager [instance: 4d36bb91-0cde-44cb-8706-d17740a9cf50] raise exceptions.translate_fault(task_info.error) [ 1562.275751] env[68906]: ERROR nova.compute.manager [instance: 4d36bb91-0cde-44cb-8706-d17740a9cf50] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1562.275751] env[68906]: ERROR nova.compute.manager [instance: 4d36bb91-0cde-44cb-8706-d17740a9cf50] Faults: ['InvalidArgument'] [ 1562.275751] env[68906]: ERROR nova.compute.manager [instance: 4d36bb91-0cde-44cb-8706-d17740a9cf50] [ 1562.275751] env[68906]: DEBUG nova.compute.utils [None req-bf384869-204f-4e55-8c64-1babf5601dfe tempest-ServerMetadataTestJSON-461341781 tempest-ServerMetadataTestJSON-461341781-project-member] [instance: 4d36bb91-0cde-44cb-8706-d17740a9cf50] VimFaultException {{(pid=68906) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1562.276867] env[68906]: DEBUG nova.compute.manager [None req-bf384869-204f-4e55-8c64-1babf5601dfe tempest-ServerMetadataTestJSON-461341781 tempest-ServerMetadataTestJSON-461341781-project-member] [instance: 4d36bb91-0cde-44cb-8706-d17740a9cf50] Build of instance 4d36bb91-0cde-44cb-8706-d17740a9cf50 was re-scheduled: A specified parameter was not correct: fileType [ 1562.276867] env[68906]: Faults: ['InvalidArgument'] {{(pid=68906) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1562.277248] env[68906]: DEBUG nova.compute.manager [None req-bf384869-204f-4e55-8c64-1babf5601dfe tempest-ServerMetadataTestJSON-461341781 tempest-ServerMetadataTestJSON-461341781-project-member] [instance: 4d36bb91-0cde-44cb-8706-d17740a9cf50] Unplugging VIFs for instance {{(pid=68906) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1562.277422] env[68906]: DEBUG nova.compute.manager [None req-bf384869-204f-4e55-8c64-1babf5601dfe tempest-ServerMetadataTestJSON-461341781 tempest-ServerMetadataTestJSON-461341781-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=68906) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1562.277610] env[68906]: DEBUG nova.compute.manager [None req-bf384869-204f-4e55-8c64-1babf5601dfe tempest-ServerMetadataTestJSON-461341781 tempest-ServerMetadataTestJSON-461341781-project-member] [instance: 4d36bb91-0cde-44cb-8706-d17740a9cf50] Deallocating network for instance {{(pid=68906) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1562.277827] env[68906]: DEBUG nova.network.neutron [None req-bf384869-204f-4e55-8c64-1babf5601dfe tempest-ServerMetadataTestJSON-461341781 tempest-ServerMetadataTestJSON-461341781-project-member] [instance: 4d36bb91-0cde-44cb-8706-d17740a9cf50] deallocate_for_instance() {{(pid=68906) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1562.676497] env[68906]: DEBUG nova.network.neutron [None req-bf384869-204f-4e55-8c64-1babf5601dfe tempest-ServerMetadataTestJSON-461341781 tempest-ServerMetadataTestJSON-461341781-project-member] [instance: 4d36bb91-0cde-44cb-8706-d17740a9cf50] Updating instance_info_cache with network_info: [] {{(pid=68906) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1562.693188] env[68906]: INFO nova.compute.manager [None req-bf384869-204f-4e55-8c64-1babf5601dfe tempest-ServerMetadataTestJSON-461341781 tempest-ServerMetadataTestJSON-461341781-project-member] [instance: 4d36bb91-0cde-44cb-8706-d17740a9cf50] Took 0.42 seconds to deallocate network for instance. [ 1562.790387] env[68906]: INFO nova.scheduler.client.report [None req-bf384869-204f-4e55-8c64-1babf5601dfe tempest-ServerMetadataTestJSON-461341781 tempest-ServerMetadataTestJSON-461341781-project-member] Deleted allocations for instance 4d36bb91-0cde-44cb-8706-d17740a9cf50 [ 1562.814019] env[68906]: DEBUG oslo_concurrency.lockutils [None req-bf384869-204f-4e55-8c64-1babf5601dfe tempest-ServerMetadataTestJSON-461341781 tempest-ServerMetadataTestJSON-461341781-project-member] Lock "4d36bb91-0cde-44cb-8706-d17740a9cf50" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 627.768s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1562.815025] env[68906]: DEBUG oslo_concurrency.lockutils [None req-f7b75175-bd21-425d-83cf-3460875241d7 tempest-ServerMetadataTestJSON-461341781 tempest-ServerMetadataTestJSON-461341781-project-member] Lock "4d36bb91-0cde-44cb-8706-d17740a9cf50" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 431.489s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1562.815254] env[68906]: DEBUG oslo_concurrency.lockutils [None req-f7b75175-bd21-425d-83cf-3460875241d7 tempest-ServerMetadataTestJSON-461341781 tempest-ServerMetadataTestJSON-461341781-project-member] Acquiring lock "4d36bb91-0cde-44cb-8706-d17740a9cf50-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1562.815464] env[68906]: DEBUG oslo_concurrency.lockutils [None req-f7b75175-bd21-425d-83cf-3460875241d7 tempest-ServerMetadataTestJSON-461341781 tempest-ServerMetadataTestJSON-461341781-project-member] Lock "4d36bb91-0cde-44cb-8706-d17740a9cf50-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s
{{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1562.815633] env[68906]: DEBUG oslo_concurrency.lockutils [None req-f7b75175-bd21-425d-83cf-3460875241d7 tempest-ServerMetadataTestJSON-461341781 tempest-ServerMetadataTestJSON-461341781-project-member] Lock "4d36bb91-0cde-44cb-8706-d17740a9cf50-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1562.817534] env[68906]: INFO nova.compute.manager [None req-f7b75175-bd21-425d-83cf-3460875241d7 tempest-ServerMetadataTestJSON-461341781 tempest-ServerMetadataTestJSON-461341781-project-member] [instance: 4d36bb91-0cde-44cb-8706-d17740a9cf50] Terminating instance [ 1562.819280] env[68906]: DEBUG nova.compute.manager [None req-f7b75175-bd21-425d-83cf-3460875241d7 tempest-ServerMetadataTestJSON-461341781 tempest-ServerMetadataTestJSON-461341781-project-member] [instance: 4d36bb91-0cde-44cb-8706-d17740a9cf50] Start destroying the instance on the hypervisor. {{(pid=68906) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1562.819429] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-f7b75175-bd21-425d-83cf-3460875241d7 tempest-ServerMetadataTestJSON-461341781 tempest-ServerMetadataTestJSON-461341781-project-member] [instance: 4d36bb91-0cde-44cb-8706-d17740a9cf50] Destroying instance {{(pid=68906) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1562.820109] env[68906]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-7ded33f0-b547-4606-bb1f-ff55c92b7c83 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1562.829084] env[68906]: DEBUG nova.compute.manager [None req-1f23ce48-c412-41ea-9aae-c46a8abfa944 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: 32f5b54d-30bf-4fe9-9622-3ff74344b3f3] Starting instance... {{(pid=68906) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1562.834041] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dbef85a5-0b68-4d62-a2d9-66c02cd0cfcc {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1562.862963] env[68906]: WARNING nova.virt.vmwareapi.vmops [None req-f7b75175-bd21-425d-83cf-3460875241d7 tempest-ServerMetadataTestJSON-461341781 tempest-ServerMetadataTestJSON-461341781-project-member] [instance: 4d36bb91-0cde-44cb-8706-d17740a9cf50] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 4d36bb91-0cde-44cb-8706-d17740a9cf50 could not be found. [ 1562.863612] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-f7b75175-bd21-425d-83cf-3460875241d7 tempest-ServerMetadataTestJSON-461341781 tempest-ServerMetadataTestJSON-461341781-project-member] [instance: 4d36bb91-0cde-44cb-8706-d17740a9cf50] Instance destroyed {{(pid=68906) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1562.863612] env[68906]: INFO nova.compute.manager [None req-f7b75175-bd21-425d-83cf-3460875241d7 tempest-ServerMetadataTestJSON-461341781 tempest-ServerMetadataTestJSON-461341781-project-member] [instance: 4d36bb91-0cde-44cb-8706-d17740a9cf50] Took 0.04 seconds to destroy the instance on the hypervisor.
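Annotation: the WARNING above shows the destroy path tolerating a backend miss. The VM was already unregistered and its datastore files deleted after the failed spawn, so the second terminate finds nothing, logs InstanceNotFound, and still reports "Instance destroyed". A self-contained sketch of that idempotent-destroy pattern; the function names are hypothetical stand-ins, not Nova APIs:

    class InstanceNotFound(Exception):
        pass

    def backend_destroy(uuid):
        # Hypothetical stand-in for the hypervisor call; the VM is already gone.
        raise InstanceNotFound(f"Instance {uuid} could not be found.")

    def destroy(uuid):
        try:
            backend_destroy(uuid)
        except InstanceNotFound as exc:
            print(f"WARNING: Instance does not exist on backend: {exc}")
        print("Instance destroyed")  # teardown continues either way

    destroy("4d36bb91-0cde-44cb-8706-d17740a9cf50")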
[ 1562.863612] env[68906]: DEBUG oslo.service.loopingcall [None req-f7b75175-bd21-425d-83cf-3460875241d7 tempest-ServerMetadataTestJSON-461341781 tempest-ServerMetadataTestJSON-461341781-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=68906) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1562.864367] env[68906]: DEBUG nova.compute.manager [-] [instance: 4d36bb91-0cde-44cb-8706-d17740a9cf50] Deallocating network for instance {{(pid=68906) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1562.864472] env[68906]: DEBUG nova.network.neutron [-] [instance: 4d36bb91-0cde-44cb-8706-d17740a9cf50] deallocate_for_instance() {{(pid=68906) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1562.881461] env[68906]: DEBUG oslo_concurrency.lockutils [None req-1f23ce48-c412-41ea-9aae-c46a8abfa944 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1562.881707] env[68906]: DEBUG oslo_concurrency.lockutils [None req-1f23ce48-c412-41ea-9aae-c46a8abfa944 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1562.883266] env[68906]: INFO nova.compute.claims [None req-1f23ce48-c412-41ea-9aae-c46a8abfa944 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: 32f5b54d-30bf-4fe9-9622-3ff74344b3f3] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1562.894499] env[68906]: DEBUG nova.network.neutron [-] [instance: 4d36bb91-0cde-44cb-8706-d17740a9cf50] Updating instance_info_cache with network_info: [] {{(pid=68906) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1562.905683] env[68906]: INFO nova.compute.manager [-] [instance: 4d36bb91-0cde-44cb-8706-d17740a9cf50] Took 0.04 seconds to deallocate network for instance. [ 1562.990676] env[68906]: DEBUG oslo_concurrency.lockutils [None req-f7b75175-bd21-425d-83cf-3460875241d7 tempest-ServerMetadataTestJSON-461341781 tempest-ServerMetadataTestJSON-461341781-project-member] Lock "4d36bb91-0cde-44cb-8706-d17740a9cf50" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.176s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1562.991512] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Lock "4d36bb91-0cde-44cb-8706-d17740a9cf50" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 352.039s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1562.991700] env[68906]: INFO nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: 4d36bb91-0cde-44cb-8706-d17740a9cf50] During sync_power_state the instance has a pending task (deleting). Skip. 
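
The "Acquiring lock ... / acquired ... waited Ns / released ... held Ns" lines throughout this trace are emitted by oslo.concurrency's lockutils wrapper around the decorated callable. A minimal sketch of the same serialization, using the real lockutils.synchronized decorator; the claim body is a placeholder, not ResourceTracker code:

    from oslo_concurrency import lockutils

    @lockutils.synchronized('compute_resources')
    def instance_claim(instance_uuid):
        # Runs with the in-process "compute_resources" lock held, so
        # concurrent claims serialize; the waited/held durations in the
        # log are measured around exactly this acquire/release.
        return {'instance': instance_uuid, 'claimed': True}

    if __name__ == '__main__':
        instance_claim('32f5b54d-30bf-4fe9-9622-3ff74344b3f3')
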
[ 1562.991873] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Lock "4d36bb91-0cde-44cb-8706-d17740a9cf50" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1563.104215] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-934be897-96af-4c45-ac3f-26447665f562 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1563.111623] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-deff949f-0fd0-4399-ac42-404d016306a1 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1563.142183] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fd5ff66a-1fa4-44b8-a351-9ea851501906 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1563.149050] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9c3aea96-5f50-4355-a352-8916db6a98ee {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1563.162260] env[68906]: DEBUG nova.compute.provider_tree [None req-1f23ce48-c412-41ea-9aae-c46a8abfa944 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Inventory has not changed in ProviderTree for provider: 1119f6db-bfd7-4ef3-bdff-5c6974dc249b {{(pid=68906) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1563.171820] env[68906]: DEBUG nova.scheduler.client.report [None req-1f23ce48-c412-41ea-9aae-c46a8abfa944 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Inventory has not changed for provider 1119f6db-bfd7-4ef3-bdff-5c6974dc249b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 93, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68906) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1563.185694] env[68906]: DEBUG oslo_concurrency.lockutils [None req-1f23ce48-c412-41ea-9aae-c46a8abfa944 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.304s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1563.186172] env[68906]: DEBUG nova.compute.manager [None req-1f23ce48-c412-41ea-9aae-c46a8abfa944 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: 32f5b54d-30bf-4fe9-9622-3ff74344b3f3] Start building networks asynchronously for instance. 
{{(pid=68906) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1563.220986] env[68906]: DEBUG nova.compute.utils [None req-1f23ce48-c412-41ea-9aae-c46a8abfa944 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Using /dev/sd instead of None {{(pid=68906) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1563.222411] env[68906]: DEBUG nova.compute.manager [None req-1f23ce48-c412-41ea-9aae-c46a8abfa944 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: 32f5b54d-30bf-4fe9-9622-3ff74344b3f3] Allocating IP information in the background. {{(pid=68906) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1563.222583] env[68906]: DEBUG nova.network.neutron [None req-1f23ce48-c412-41ea-9aae-c46a8abfa944 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: 32f5b54d-30bf-4fe9-9622-3ff74344b3f3] allocate_for_instance() {{(pid=68906) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1563.231655] env[68906]: DEBUG nova.compute.manager [None req-1f23ce48-c412-41ea-9aae-c46a8abfa944 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: 32f5b54d-30bf-4fe9-9622-3ff74344b3f3] Start building block device mappings for instance. {{(pid=68906) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1563.287597] env[68906]: DEBUG nova.policy [None req-1f23ce48-c412-41ea-9aae-c46a8abfa944 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'e208107293fd4f82af1f396d43464b69', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '90f212f7916446919081fcdc0527ebb0', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68906) authorize /opt/stack/nova/nova/policy.py:203}} [ 1563.297624] env[68906]: DEBUG nova.compute.manager [None req-1f23ce48-c412-41ea-9aae-c46a8abfa944 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: 32f5b54d-30bf-4fe9-9622-3ff74344b3f3] Start spawning the instance on the hypervisor. 
{{(pid=68906) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1563.323031] env[68906]: DEBUG nova.virt.hardware [None req-1f23ce48-c412-41ea-9aae-c46a8abfa944 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-17T13:00:39Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-17T13:00:23Z,direct_url=,disk_format='vmdk',id=b1400c31-d33b-4e13-944f-4c645e62493e,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='1ae7bf3a375d41c6af5e7536af51ffd1',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-17T13:00:24Z,virtual_size=,visibility=), allow threads: False {{(pid=68906) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1563.323031] env[68906]: DEBUG nova.virt.hardware [None req-1f23ce48-c412-41ea-9aae-c46a8abfa944 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Flavor limits 0:0:0 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1563.323031] env[68906]: DEBUG nova.virt.hardware [None req-1f23ce48-c412-41ea-9aae-c46a8abfa944 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Image limits 0:0:0 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1563.323256] env[68906]: DEBUG nova.virt.hardware [None req-1f23ce48-c412-41ea-9aae-c46a8abfa944 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Flavor pref 0:0:0 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1563.323256] env[68906]: DEBUG nova.virt.hardware [None req-1f23ce48-c412-41ea-9aae-c46a8abfa944 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Image pref 0:0:0 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1563.323256] env[68906]: DEBUG nova.virt.hardware [None req-1f23ce48-c412-41ea-9aae-c46a8abfa944 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1563.323439] env[68906]: DEBUG nova.virt.hardware [None req-1f23ce48-c412-41ea-9aae-c46a8abfa944 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68906) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1563.323599] env[68906]: DEBUG nova.virt.hardware [None req-1f23ce48-c412-41ea-9aae-c46a8abfa944 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68906) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1563.323808] env[68906]: DEBUG nova.virt.hardware [None req-1f23ce48-c412-41ea-9aae-c46a8abfa944 tempest-ServersTestJSON-1226730598 
tempest-ServersTestJSON-1226730598-project-member] Got 1 possible topologies {{(pid=68906) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1563.323981] env[68906]: DEBUG nova.virt.hardware [None req-1f23ce48-c412-41ea-9aae-c46a8abfa944 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68906) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1563.324173] env[68906]: DEBUG nova.virt.hardware [None req-1f23ce48-c412-41ea-9aae-c46a8abfa944 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68906) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1563.325086] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e50eda95-060c-4f10-844d-94aa2bca270a {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1563.333426] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3b1454f3-acc9-4f79-866e-cadc54296935 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1563.613053] env[68906]: DEBUG nova.network.neutron [None req-1f23ce48-c412-41ea-9aae-c46a8abfa944 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: 32f5b54d-30bf-4fe9-9622-3ff74344b3f3] Successfully created port: 8363aad2-d556-481a-8209-73c10c7f8dbb {{(pid=68906) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1564.167645] env[68906]: DEBUG nova.network.neutron [None req-1f23ce48-c412-41ea-9aae-c46a8abfa944 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: 32f5b54d-30bf-4fe9-9622-3ff74344b3f3] Successfully updated port: 8363aad2-d556-481a-8209-73c10c7f8dbb {{(pid=68906) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1564.177806] env[68906]: DEBUG oslo_concurrency.lockutils [None req-1f23ce48-c412-41ea-9aae-c46a8abfa944 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Acquiring lock "refresh_cache-32f5b54d-30bf-4fe9-9622-3ff74344b3f3" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1564.177972] env[68906]: DEBUG oslo_concurrency.lockutils [None req-1f23ce48-c412-41ea-9aae-c46a8abfa944 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Acquired lock "refresh_cache-32f5b54d-30bf-4fe9-9622-3ff74344b3f3" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1564.178138] env[68906]: DEBUG nova.network.neutron [None req-1f23ce48-c412-41ea-9aae-c46a8abfa944 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: 32f5b54d-30bf-4fe9-9622-3ff74344b3f3] Building network info cache for instance {{(pid=68906) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1564.218219] env[68906]: DEBUG nova.network.neutron [None req-1f23ce48-c412-41ea-9aae-c46a8abfa944 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: 32f5b54d-30bf-4fe9-9622-3ff74344b3f3] Instance cache missing network info. 
{{(pid=68906) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1564.368769] env[68906]: DEBUG nova.network.neutron [None req-1f23ce48-c412-41ea-9aae-c46a8abfa944 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: 32f5b54d-30bf-4fe9-9622-3ff74344b3f3] Updating instance_info_cache with network_info: [{"id": "8363aad2-d556-481a-8209-73c10c7f8dbb", "address": "fa:16:3e:e7:a0:52", "network": {"id": "da6ba094-8e2a-4f76-813c-8668f482685b", "bridge": "br-int", "label": "tempest-ServersTestJSON-512380607-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "90f212f7916446919081fcdc0527ebb0", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "0cd5d325-3053-407e-a4ee-f627e82a23f9", "external-id": "nsx-vlan-transportzone-809", "segmentation_id": 809, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap8363aad2-d5", "ovs_interfaceid": "8363aad2-d556-481a-8209-73c10c7f8dbb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68906) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1564.381917] env[68906]: DEBUG oslo_concurrency.lockutils [None req-1f23ce48-c412-41ea-9aae-c46a8abfa944 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Releasing lock "refresh_cache-32f5b54d-30bf-4fe9-9622-3ff74344b3f3" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1564.382306] env[68906]: DEBUG nova.compute.manager [None req-1f23ce48-c412-41ea-9aae-c46a8abfa944 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: 32f5b54d-30bf-4fe9-9622-3ff74344b3f3] Instance network_info: |[{"id": "8363aad2-d556-481a-8209-73c10c7f8dbb", "address": "fa:16:3e:e7:a0:52", "network": {"id": "da6ba094-8e2a-4f76-813c-8668f482685b", "bridge": "br-int", "label": "tempest-ServersTestJSON-512380607-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "90f212f7916446919081fcdc0527ebb0", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "0cd5d325-3053-407e-a4ee-f627e82a23f9", "external-id": "nsx-vlan-transportzone-809", "segmentation_id": 809, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap8363aad2-d5", "ovs_interfaceid": "8363aad2-d556-481a-8209-73c10c7f8dbb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=68906) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1564.382788] env[68906]: 
DEBUG nova.virt.vmwareapi.vmops [None req-1f23ce48-c412-41ea-9aae-c46a8abfa944 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: 32f5b54d-30bf-4fe9-9622-3ff74344b3f3] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:e7:a0:52', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '0cd5d325-3053-407e-a4ee-f627e82a23f9', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '8363aad2-d556-481a-8209-73c10c7f8dbb', 'vif_model': 'vmxnet3'}] {{(pid=68906) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1564.392665] env[68906]: DEBUG oslo.service.loopingcall [None req-1f23ce48-c412-41ea-9aae-c46a8abfa944 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=68906) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1564.393161] env[68906]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 32f5b54d-30bf-4fe9-9622-3ff74344b3f3] Creating VM on the ESX host {{(pid=68906) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1564.393411] env[68906]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-da2ec92d-2c37-4ab1-8e7a-405fe5f85ee3 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1564.413911] env[68906]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1564.413911] env[68906]: value = "task-3475409" [ 1564.413911] env[68906]: _type = "Task" [ 1564.413911] env[68906]: } to complete. {{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1564.421524] env[68906]: DEBUG oslo_vmware.api [-] Task: {'id': task-3475409, 'name': CreateVM_Task} progress is 0%. 
{{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1564.794217] env[68906]: DEBUG nova.compute.manager [req-13b63762-4558-44e9-b608-3894981d239e req-cf89ad67-164a-4702-bd64-8d00fcac6a2c service nova] [instance: 32f5b54d-30bf-4fe9-9622-3ff74344b3f3] Received event network-vif-plugged-8363aad2-d556-481a-8209-73c10c7f8dbb {{(pid=68906) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1564.794373] env[68906]: DEBUG oslo_concurrency.lockutils [req-13b63762-4558-44e9-b608-3894981d239e req-cf89ad67-164a-4702-bd64-8d00fcac6a2c service nova] Acquiring lock "32f5b54d-30bf-4fe9-9622-3ff74344b3f3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1564.794559] env[68906]: DEBUG oslo_concurrency.lockutils [req-13b63762-4558-44e9-b608-3894981d239e req-cf89ad67-164a-4702-bd64-8d00fcac6a2c service nova] Lock "32f5b54d-30bf-4fe9-9622-3ff74344b3f3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1564.794763] env[68906]: DEBUG oslo_concurrency.lockutils [req-13b63762-4558-44e9-b608-3894981d239e req-cf89ad67-164a-4702-bd64-8d00fcac6a2c service nova] Lock "32f5b54d-30bf-4fe9-9622-3ff74344b3f3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1564.794919] env[68906]: DEBUG nova.compute.manager [req-13b63762-4558-44e9-b608-3894981d239e req-cf89ad67-164a-4702-bd64-8d00fcac6a2c service nova] [instance: 32f5b54d-30bf-4fe9-9622-3ff74344b3f3] No waiting events found dispatching network-vif-plugged-8363aad2-d556-481a-8209-73c10c7f8dbb {{(pid=68906) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1564.795116] env[68906]: WARNING nova.compute.manager [req-13b63762-4558-44e9-b608-3894981d239e req-cf89ad67-164a-4702-bd64-8d00fcac6a2c service nova] [instance: 32f5b54d-30bf-4fe9-9622-3ff74344b3f3] Received unexpected event network-vif-plugged-8363aad2-d556-481a-8209-73c10c7f8dbb for instance with vm_state building and task_state spawning. [ 1564.795283] env[68906]: DEBUG nova.compute.manager [req-13b63762-4558-44e9-b608-3894981d239e req-cf89ad67-164a-4702-bd64-8d00fcac6a2c service nova] [instance: 32f5b54d-30bf-4fe9-9622-3ff74344b3f3] Received event network-changed-8363aad2-d556-481a-8209-73c10c7f8dbb {{(pid=68906) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1564.795456] env[68906]: DEBUG nova.compute.manager [req-13b63762-4558-44e9-b608-3894981d239e req-cf89ad67-164a-4702-bd64-8d00fcac6a2c service nova] [instance: 32f5b54d-30bf-4fe9-9622-3ff74344b3f3] Refreshing instance network info cache due to event network-changed-8363aad2-d556-481a-8209-73c10c7f8dbb. 
{{(pid=68906) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1564.795603] env[68906]: DEBUG oslo_concurrency.lockutils [req-13b63762-4558-44e9-b608-3894981d239e req-cf89ad67-164a-4702-bd64-8d00fcac6a2c service nova] Acquiring lock "refresh_cache-32f5b54d-30bf-4fe9-9622-3ff74344b3f3" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1564.795773] env[68906]: DEBUG oslo_concurrency.lockutils [req-13b63762-4558-44e9-b608-3894981d239e req-cf89ad67-164a-4702-bd64-8d00fcac6a2c service nova] Acquired lock "refresh_cache-32f5b54d-30bf-4fe9-9622-3ff74344b3f3" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1564.795881] env[68906]: DEBUG nova.network.neutron [req-13b63762-4558-44e9-b608-3894981d239e req-cf89ad67-164a-4702-bd64-8d00fcac6a2c service nova] [instance: 32f5b54d-30bf-4fe9-9622-3ff74344b3f3] Refreshing network info cache for port 8363aad2-d556-481a-8209-73c10c7f8dbb {{(pid=68906) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1564.926043] env[68906]: DEBUG oslo_vmware.api [-] Task: {'id': task-3475409, 'name': CreateVM_Task, 'duration_secs': 0.287996} completed successfully. {{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1564.926043] env[68906]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 32f5b54d-30bf-4fe9-9622-3ff74344b3f3] Created VM on the ESX host {{(pid=68906) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1564.926043] env[68906]: DEBUG oslo_concurrency.lockutils [None req-1f23ce48-c412-41ea-9aae-c46a8abfa944 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1564.926043] env[68906]: DEBUG oslo_concurrency.lockutils [None req-1f23ce48-c412-41ea-9aae-c46a8abfa944 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Acquired lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1564.926043] env[68906]: DEBUG oslo_concurrency.lockutils [None req-1f23ce48-c412-41ea-9aae-c46a8abfa944 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1564.926275] env[68906]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-ead5cc16-4abc-44bb-aef4-cc8890f66f8d {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1564.929954] env[68906]: DEBUG oslo_vmware.api [None req-1f23ce48-c412-41ea-9aae-c46a8abfa944 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Waiting for the task: (returnval){ [ 1564.929954] env[68906]: value = "session[52a3cb0d-1212-490c-2669-91043b4da4d8]52f10503-1335-4124-9deb-0e173a1510f6" [ 1564.929954] env[68906]: _type = "Task" [ 1564.929954] env[68906]: } to complete. 
{{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1564.941703] env[68906]: DEBUG oslo_vmware.api [None req-1f23ce48-c412-41ea-9aae-c46a8abfa944 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Task: {'id': session[52a3cb0d-1212-490c-2669-91043b4da4d8]52f10503-1335-4124-9deb-0e173a1510f6, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1565.042913] env[68906]: DEBUG nova.network.neutron [req-13b63762-4558-44e9-b608-3894981d239e req-cf89ad67-164a-4702-bd64-8d00fcac6a2c service nova] [instance: 32f5b54d-30bf-4fe9-9622-3ff74344b3f3] Updated VIF entry in instance network info cache for port 8363aad2-d556-481a-8209-73c10c7f8dbb. {{(pid=68906) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1565.043270] env[68906]: DEBUG nova.network.neutron [req-13b63762-4558-44e9-b608-3894981d239e req-cf89ad67-164a-4702-bd64-8d00fcac6a2c service nova] [instance: 32f5b54d-30bf-4fe9-9622-3ff74344b3f3] Updating instance_info_cache with network_info: [{"id": "8363aad2-d556-481a-8209-73c10c7f8dbb", "address": "fa:16:3e:e7:a0:52", "network": {"id": "da6ba094-8e2a-4f76-813c-8668f482685b", "bridge": "br-int", "label": "tempest-ServersTestJSON-512380607-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "90f212f7916446919081fcdc0527ebb0", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "0cd5d325-3053-407e-a4ee-f627e82a23f9", "external-id": "nsx-vlan-transportzone-809", "segmentation_id": 809, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap8363aad2-d5", "ovs_interfaceid": "8363aad2-d556-481a-8209-73c10c7f8dbb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68906) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1565.052777] env[68906]: DEBUG oslo_concurrency.lockutils [req-13b63762-4558-44e9-b608-3894981d239e req-cf89ad67-164a-4702-bd64-8d00fcac6a2c service nova] Releasing lock "refresh_cache-32f5b54d-30bf-4fe9-9622-3ff74344b3f3" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1565.443102] env[68906]: DEBUG oslo_concurrency.lockutils [None req-1f23ce48-c412-41ea-9aae-c46a8abfa944 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Releasing lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1565.443102] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-1f23ce48-c412-41ea-9aae-c46a8abfa944 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: 32f5b54d-30bf-4fe9-9622-3ff74344b3f3] Processing image b1400c31-d33b-4e13-944f-4c645e62493e {{(pid=68906) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1565.443102] 
env[68906]: DEBUG oslo_concurrency.lockutils [None req-1f23ce48-c412-41ea-9aae-c46a8abfa944 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e/b1400c31-d33b-4e13-944f-4c645e62493e.vmdk" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1571.962229] env[68906]: DEBUG oslo_concurrency.lockutils [None req-6a92d79c-99f6-49e2-a1ce-a661d04c8903 tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Acquiring lock "17327bc3-433e-4006-93c7-e53714ed70c2" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1578.165731] env[68906]: DEBUG oslo_concurrency.lockutils [None req-8706cf73-54ea-418d-91c3-3d08fba88ac5 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Acquiring lock "ce6e5cd6-efb8-46d1-811d-74c084661cce" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1578.166089] env[68906]: DEBUG oslo_concurrency.lockutils [None req-8706cf73-54ea-418d-91c3-3d08fba88ac5 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Lock "ce6e5cd6-efb8-46d1-811d-74c084661cce" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1603.599681] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1606.140914] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1606.141301] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Starting heal instance info cache {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 1606.141301] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Rebuilding the list of instances to heal {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 1606.164098] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: db011373-7285-4882-8bce-d39cfa22fe80] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1606.164255] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: 1fdb401a-ac25-4418-803c-fc0b2297f2d4] Skipping network cache update for instance because it is Building. 
{{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1606.164387] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: 6c28f571-e74a-48f9-9cc7-a9e4ddea8b09] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1606.164514] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: e0e595e3-e47e-4cf1-8977-f004eca942d1] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1606.164635] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: 7466df8a-59a9-49b9-bff7-c4efbeae3eee] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1606.164782] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: 89171680-c76d-4826-9236-379542661ffb] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1606.164912] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: 9b884416-df89-4d8c-b2ab-0667db52a718] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1606.165043] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: aed06616-d008-4695-b66e-9f40acf5ebd3] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1606.165168] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: 17327bc3-433e-4006-93c7-e53714ed70c2] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1606.165287] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: 32f5b54d-30bf-4fe9-9622-3ff74344b3f3] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1606.165405] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Didn't find any instances for network info cache update. 
{{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 1606.165914] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1609.140468] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1610.141295] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1610.370395] env[68906]: WARNING oslo_vmware.rw_handles [None req-4d72aadc-5a5f-4626-b220-381c1afbdf3f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1610.370395] env[68906]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1610.370395] env[68906]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1610.370395] env[68906]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1610.370395] env[68906]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1610.370395] env[68906]: ERROR oslo_vmware.rw_handles response.begin() [ 1610.370395] env[68906]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1610.370395] env[68906]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1610.370395] env[68906]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1610.370395] env[68906]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1610.370395] env[68906]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1610.370395] env[68906]: ERROR oslo_vmware.rw_handles [ 1610.370949] env[68906]: DEBUG nova.virt.vmwareapi.images [None req-4d72aadc-5a5f-4626-b220-381c1afbdf3f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] [instance: db011373-7285-4882-8bce-d39cfa22fe80] Downloaded image file data b1400c31-d33b-4e13-944f-4c645e62493e to vmware_temp/e01e2753-288b-4e2b-9222-4184c7edcbf8/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk on the data store datastore2 {{(pid=68906) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1610.372600] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-4d72aadc-5a5f-4626-b220-381c1afbdf3f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] [instance: db011373-7285-4882-8bce-d39cfa22fe80] Caching image {{(pid=68906) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1610.372857] env[68906]: DEBUG nova.virt.vmwareapi.vm_util [None req-4d72aadc-5a5f-4626-b220-381c1afbdf3f tempest-AttachInterfacesTestJSON-916650638 
tempest-AttachInterfacesTestJSON-916650638-project-member] Copying Virtual Disk [datastore2] vmware_temp/e01e2753-288b-4e2b-9222-4184c7edcbf8/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk to [datastore2] vmware_temp/e01e2753-288b-4e2b-9222-4184c7edcbf8/b1400c31-d33b-4e13-944f-4c645e62493e/b1400c31-d33b-4e13-944f-4c645e62493e.vmdk {{(pid=68906) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1610.373157] env[68906]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-6c9522d2-73f9-4a57-ad01-5a54fa3a6637 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1610.381574] env[68906]: DEBUG oslo_vmware.api [None req-4d72aadc-5a5f-4626-b220-381c1afbdf3f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Waiting for the task: (returnval){ [ 1610.381574] env[68906]: value = "task-3475410" [ 1610.381574] env[68906]: _type = "Task" [ 1610.381574] env[68906]: } to complete. {{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1610.389808] env[68906]: DEBUG oslo_vmware.api [None req-4d72aadc-5a5f-4626-b220-381c1afbdf3f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Task: {'id': task-3475410, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1610.891451] env[68906]: DEBUG oslo_vmware.exceptions [None req-4d72aadc-5a5f-4626-b220-381c1afbdf3f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Fault InvalidArgument not matched. 
{{(pid=68906) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1610.891734] env[68906]: DEBUG oslo_concurrency.lockutils [None req-4d72aadc-5a5f-4626-b220-381c1afbdf3f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Releasing lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e/b1400c31-d33b-4e13-944f-4c645e62493e.vmdk" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1610.892363] env[68906]: ERROR nova.compute.manager [None req-4d72aadc-5a5f-4626-b220-381c1afbdf3f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] [instance: db011373-7285-4882-8bce-d39cfa22fe80] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1610.892363] env[68906]: Faults: ['InvalidArgument'] [ 1610.892363] env[68906]: ERROR nova.compute.manager [instance: db011373-7285-4882-8bce-d39cfa22fe80] Traceback (most recent call last): [ 1610.892363] env[68906]: ERROR nova.compute.manager [instance: db011373-7285-4882-8bce-d39cfa22fe80] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1610.892363] env[68906]: ERROR nova.compute.manager [instance: db011373-7285-4882-8bce-d39cfa22fe80] yield resources [ 1610.892363] env[68906]: ERROR nova.compute.manager [instance: db011373-7285-4882-8bce-d39cfa22fe80] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1610.892363] env[68906]: ERROR nova.compute.manager [instance: db011373-7285-4882-8bce-d39cfa22fe80] self.driver.spawn(context, instance, image_meta, [ 1610.892363] env[68906]: ERROR nova.compute.manager [instance: db011373-7285-4882-8bce-d39cfa22fe80] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1610.892363] env[68906]: ERROR nova.compute.manager [instance: db011373-7285-4882-8bce-d39cfa22fe80] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1610.892363] env[68906]: ERROR nova.compute.manager [instance: db011373-7285-4882-8bce-d39cfa22fe80] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1610.892363] env[68906]: ERROR nova.compute.manager [instance: db011373-7285-4882-8bce-d39cfa22fe80] self._fetch_image_if_missing(context, vi) [ 1610.892363] env[68906]: ERROR nova.compute.manager [instance: db011373-7285-4882-8bce-d39cfa22fe80] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1610.892729] env[68906]: ERROR nova.compute.manager [instance: db011373-7285-4882-8bce-d39cfa22fe80] image_cache(vi, tmp_image_ds_loc) [ 1610.892729] env[68906]: ERROR nova.compute.manager [instance: db011373-7285-4882-8bce-d39cfa22fe80] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1610.892729] env[68906]: ERROR nova.compute.manager [instance: db011373-7285-4882-8bce-d39cfa22fe80] vm_util.copy_virtual_disk( [ 1610.892729] env[68906]: ERROR nova.compute.manager [instance: db011373-7285-4882-8bce-d39cfa22fe80] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1610.892729] env[68906]: ERROR nova.compute.manager [instance: db011373-7285-4882-8bce-d39cfa22fe80] session._wait_for_task(vmdk_copy_task) [ 1610.892729] env[68906]: ERROR nova.compute.manager [instance: db011373-7285-4882-8bce-d39cfa22fe80] File 
"/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1610.892729] env[68906]: ERROR nova.compute.manager [instance: db011373-7285-4882-8bce-d39cfa22fe80] return self.wait_for_task(task_ref) [ 1610.892729] env[68906]: ERROR nova.compute.manager [instance: db011373-7285-4882-8bce-d39cfa22fe80] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1610.892729] env[68906]: ERROR nova.compute.manager [instance: db011373-7285-4882-8bce-d39cfa22fe80] return evt.wait() [ 1610.892729] env[68906]: ERROR nova.compute.manager [instance: db011373-7285-4882-8bce-d39cfa22fe80] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1610.892729] env[68906]: ERROR nova.compute.manager [instance: db011373-7285-4882-8bce-d39cfa22fe80] result = hub.switch() [ 1610.892729] env[68906]: ERROR nova.compute.manager [instance: db011373-7285-4882-8bce-d39cfa22fe80] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1610.892729] env[68906]: ERROR nova.compute.manager [instance: db011373-7285-4882-8bce-d39cfa22fe80] return self.greenlet.switch() [ 1610.893118] env[68906]: ERROR nova.compute.manager [instance: db011373-7285-4882-8bce-d39cfa22fe80] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1610.893118] env[68906]: ERROR nova.compute.manager [instance: db011373-7285-4882-8bce-d39cfa22fe80] self.f(*self.args, **self.kw) [ 1610.893118] env[68906]: ERROR nova.compute.manager [instance: db011373-7285-4882-8bce-d39cfa22fe80] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1610.893118] env[68906]: ERROR nova.compute.manager [instance: db011373-7285-4882-8bce-d39cfa22fe80] raise exceptions.translate_fault(task_info.error) [ 1610.893118] env[68906]: ERROR nova.compute.manager [instance: db011373-7285-4882-8bce-d39cfa22fe80] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1610.893118] env[68906]: ERROR nova.compute.manager [instance: db011373-7285-4882-8bce-d39cfa22fe80] Faults: ['InvalidArgument'] [ 1610.893118] env[68906]: ERROR nova.compute.manager [instance: db011373-7285-4882-8bce-d39cfa22fe80] [ 1610.893118] env[68906]: INFO nova.compute.manager [None req-4d72aadc-5a5f-4626-b220-381c1afbdf3f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] [instance: db011373-7285-4882-8bce-d39cfa22fe80] Terminating instance [ 1610.894222] env[68906]: DEBUG oslo_concurrency.lockutils [None req-ec74f4e8-e37f-4e29-9aee-a15837533de1 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Acquired lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e/b1400c31-d33b-4e13-944f-4c645e62493e.vmdk" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1610.894426] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-ec74f4e8-e37f-4e29-9aee-a15837533de1 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=68906) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1610.894662] env[68906]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-a0ce9e5f-ba09-43b8-bcf8-4416eb3a6f9e {{(pid=68906) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1610.896907] env[68906]: DEBUG nova.compute.manager [None req-4d72aadc-5a5f-4626-b220-381c1afbdf3f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] [instance: db011373-7285-4882-8bce-d39cfa22fe80] Start destroying the instance on the hypervisor. {{(pid=68906) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1610.897142] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-4d72aadc-5a5f-4626-b220-381c1afbdf3f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] [instance: db011373-7285-4882-8bce-d39cfa22fe80] Destroying instance {{(pid=68906) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1610.897857] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e03f5f8b-dcdd-446a-8dfa-a18a38282fa0 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1610.904835] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-4d72aadc-5a5f-4626-b220-381c1afbdf3f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] [instance: db011373-7285-4882-8bce-d39cfa22fe80] Unregistering the VM {{(pid=68906) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1610.905073] env[68906]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-71c9910e-6d92-4cbd-98e1-7bf26e41446c {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1610.907301] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-ec74f4e8-e37f-4e29-9aee-a15837533de1 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=68906) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1610.907471] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-ec74f4e8-e37f-4e29-9aee-a15837533de1 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=68906) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1610.908465] env[68906]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-16c3916a-8275-4d09-84a6-cc067d2ccac1 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1610.913015] env[68906]: DEBUG oslo_vmware.api [None req-ec74f4e8-e37f-4e29-9aee-a15837533de1 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Waiting for the task: (returnval){ [ 1610.913015] env[68906]: value = "session[52a3cb0d-1212-490c-2669-91043b4da4d8]521bb52f-f833-882e-33cb-a06daa1b009c" [ 1610.913015] env[68906]: _type = "Task" [ 1610.913015] env[68906]: } to complete. {{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1610.920962] env[68906]: DEBUG oslo_vmware.api [None req-ec74f4e8-e37f-4e29-9aee-a15837533de1 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Task: {'id': session[52a3cb0d-1212-490c-2669-91043b4da4d8]521bb52f-f833-882e-33cb-a06daa1b009c, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1610.985297] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-4d72aadc-5a5f-4626-b220-381c1afbdf3f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] [instance: db011373-7285-4882-8bce-d39cfa22fe80] Unregistered the VM {{(pid=68906) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1610.985518] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-4d72aadc-5a5f-4626-b220-381c1afbdf3f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] [instance: db011373-7285-4882-8bce-d39cfa22fe80] Deleting contents of the VM from datastore datastore2 {{(pid=68906) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1610.985687] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-4d72aadc-5a5f-4626-b220-381c1afbdf3f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Deleting the datastore file [datastore2] db011373-7285-4882-8bce-d39cfa22fe80 {{(pid=68906) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1610.985942] env[68906]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-3166c631-b242-46bb-af78-babf2e3afd37 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1610.991935] env[68906]: DEBUG oslo_vmware.api [None req-4d72aadc-5a5f-4626-b220-381c1afbdf3f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Waiting for the task: (returnval){ [ 1610.991935] env[68906]: value = "task-3475412" [ 1610.991935] env[68906]: _type = "Task" [ 1610.991935] env[68906]: } to complete. {{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1610.999401] env[68906]: DEBUG oslo_vmware.api [None req-4d72aadc-5a5f-4626-b220-381c1afbdf3f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Task: {'id': task-3475412, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1611.140178] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1611.140391] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=68906) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 1611.422858] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-ec74f4e8-e37f-4e29-9aee-a15837533de1 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: 1fdb401a-ac25-4418-803c-fc0b2297f2d4] Preparing fetch location {{(pid=68906) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1611.423192] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-ec74f4e8-e37f-4e29-9aee-a15837533de1 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Creating directory with path [datastore2] vmware_temp/69489f94-cac9-44bb-b420-b272298929a3/b1400c31-d33b-4e13-944f-4c645e62493e {{(pid=68906) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1611.423400] env[68906]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-46d36025-7863-4173-9f4c-fc92ff5cc446 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1611.433914] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-ec74f4e8-e37f-4e29-9aee-a15837533de1 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Created directory with path [datastore2] vmware_temp/69489f94-cac9-44bb-b420-b272298929a3/b1400c31-d33b-4e13-944f-4c645e62493e {{(pid=68906) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1611.434111] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-ec74f4e8-e37f-4e29-9aee-a15837533de1 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: 1fdb401a-ac25-4418-803c-fc0b2297f2d4] Fetch image to [datastore2] vmware_temp/69489f94-cac9-44bb-b420-b272298929a3/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk {{(pid=68906) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1611.434282] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-ec74f4e8-e37f-4e29-9aee-a15837533de1 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: 1fdb401a-ac25-4418-803c-fc0b2297f2d4] Downloading image file data b1400c31-d33b-4e13-944f-4c645e62493e to [datastore2] vmware_temp/69489f94-cac9-44bb-b420-b272298929a3/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk on the data store datastore2 {{(pid=68906) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1611.435018] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-101af218-3e64-4b48-8372-9fc29771cfc0 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1611.441269] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5b062af4-3e5a-4eb1-85f9-03a75e0ac409 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1611.449965] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6d350a79-7ef0-42c7-bd3b-a7e6df003e88 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1611.480310] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f1b565e4-a3f2-4c79-8900-7e27a1452a35 {{(pid=68906) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1611.485423] env[68906]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-6e8851b6-7692-44d9-b807-cbe71009f0d4 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1611.499540] env[68906]: DEBUG oslo_vmware.api [None req-4d72aadc-5a5f-4626-b220-381c1afbdf3f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Task: {'id': task-3475412, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.063514} completed successfully. {{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1611.499765] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-4d72aadc-5a5f-4626-b220-381c1afbdf3f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Deleted the datastore file {{(pid=68906) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1611.499945] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-4d72aadc-5a5f-4626-b220-381c1afbdf3f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] [instance: db011373-7285-4882-8bce-d39cfa22fe80] Deleted contents of the VM from datastore datastore2 {{(pid=68906) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1611.500131] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-4d72aadc-5a5f-4626-b220-381c1afbdf3f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] [instance: db011373-7285-4882-8bce-d39cfa22fe80] Instance destroyed {{(pid=68906) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1611.500309] env[68906]: INFO nova.compute.manager [None req-4d72aadc-5a5f-4626-b220-381c1afbdf3f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] [instance: db011373-7285-4882-8bce-d39cfa22fe80] Took 0.60 seconds to destroy the instance on the hypervisor. 
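The wait_for_task / _poll_task exchanges above (task-3475412, DeleteDatastoreFile_Task, "progress is 0%" followed by "completed successfully ... duration_secs: 0.063514") follow oslo.vmware's standard polling pattern: the session starts a vCenter task, then re-reads the task's info on a fixed interval until it reaches a terminal state. A minimal sketch of that loop, assuming a caller-supplied get_task_info(task_ref) that returns a dict with state/progress/error keys (illustrative names, not the oslo.vmware signatures):

import time

def wait_for_task(get_task_info, task_ref, poll_interval=0.5):
    """Poll a vCenter task until it succeeds or surfaces its fault."""
    while True:
        info = get_task_info(task_ref)  # one PropertyCollector round-trip
        if info["state"] == "success":
            return info
        if info["state"] == "error":
            # oslo.vmware raises a translated exception here instead.
            raise RuntimeError(info["error"])
        # Corresponds to the "progress is 0%." DEBUG lines in the log.
        print(f"Task {task_ref}: {info.get('progress', 0)}% complete")
        time.sleep(poll_interval)

Each iteration produces one "_poll_task" DEBUG line like the ones above; the real implementation drives this through an eventlet-based looping call rather than time.sleep.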
[ 1611.502311] env[68906]: DEBUG nova.compute.claims [None req-4d72aadc-5a5f-4626-b220-381c1afbdf3f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] [instance: db011373-7285-4882-8bce-d39cfa22fe80] Aborting claim: {{(pid=68906) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1611.502483] env[68906]: DEBUG oslo_concurrency.lockutils [None req-4d72aadc-5a5f-4626-b220-381c1afbdf3f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1611.502695] env[68906]: DEBUG oslo_concurrency.lockutils [None req-4d72aadc-5a5f-4626-b220-381c1afbdf3f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1611.508862] env[68906]: DEBUG nova.virt.vmwareapi.images [None req-ec74f4e8-e37f-4e29-9aee-a15837533de1 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: 1fdb401a-ac25-4418-803c-fc0b2297f2d4] Downloading image file data b1400c31-d33b-4e13-944f-4c645e62493e to the data store datastore2 {{(pid=68906) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1611.560453] env[68906]: DEBUG oslo_vmware.rw_handles [None req-ec74f4e8-e37f-4e29-9aee-a15837533de1 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/69489f94-cac9-44bb-b420-b272298929a3/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=68906) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1611.618608] env[68906]: DEBUG oslo_vmware.rw_handles [None req-ec74f4e8-e37f-4e29-9aee-a15837533de1 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Completed reading data from the image iterator. {{(pid=68906) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1611.618792] env[68906]: DEBUG oslo_vmware.rw_handles [None req-ec74f4e8-e37f-4e29-9aee-a15837533de1 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Closing write handle for https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/69489f94-cac9-44bb-b420-b272298929a3/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=68906) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1611.772295] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5541e632-ffce-4119-8257-41def1f272be {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1611.780270] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-065be79d-d3e2-46d0-8a2e-d4704c331920 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1611.809735] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-652880b4-0a41-4991-a16e-957486e60091 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1611.816547] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9960db12-80f4-4608-82e0-1c1e07fd4c4d {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1611.828958] env[68906]: DEBUG nova.compute.provider_tree [None req-4d72aadc-5a5f-4626-b220-381c1afbdf3f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Inventory has not changed in ProviderTree for provider: 1119f6db-bfd7-4ef3-bdff-5c6974dc249b {{(pid=68906) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1611.837417] env[68906]: DEBUG nova.scheduler.client.report [None req-4d72aadc-5a5f-4626-b220-381c1afbdf3f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Inventory has not changed for provider 1119f6db-bfd7-4ef3-bdff-5c6974dc249b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 93, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68906) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1611.850507] env[68906]: DEBUG oslo_concurrency.lockutils [None req-4d72aadc-5a5f-4626-b220-381c1afbdf3f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.348s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1611.851037] env[68906]: ERROR nova.compute.manager [None req-4d72aadc-5a5f-4626-b220-381c1afbdf3f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] [instance: db011373-7285-4882-8bce-d39cfa22fe80] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1611.851037] env[68906]: Faults: ['InvalidArgument'] [ 1611.851037] env[68906]: ERROR nova.compute.manager [instance: db011373-7285-4882-8bce-d39cfa22fe80] Traceback (most recent call last): [ 1611.851037] env[68906]: ERROR nova.compute.manager [instance: db011373-7285-4882-8bce-d39cfa22fe80] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1611.851037] 
env[68906]: ERROR nova.compute.manager [instance: db011373-7285-4882-8bce-d39cfa22fe80] self.driver.spawn(context, instance, image_meta, [ 1611.851037] env[68906]: ERROR nova.compute.manager [instance: db011373-7285-4882-8bce-d39cfa22fe80] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1611.851037] env[68906]: ERROR nova.compute.manager [instance: db011373-7285-4882-8bce-d39cfa22fe80] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1611.851037] env[68906]: ERROR nova.compute.manager [instance: db011373-7285-4882-8bce-d39cfa22fe80] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1611.851037] env[68906]: ERROR nova.compute.manager [instance: db011373-7285-4882-8bce-d39cfa22fe80] self._fetch_image_if_missing(context, vi) [ 1611.851037] env[68906]: ERROR nova.compute.manager [instance: db011373-7285-4882-8bce-d39cfa22fe80] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1611.851037] env[68906]: ERROR nova.compute.manager [instance: db011373-7285-4882-8bce-d39cfa22fe80] image_cache(vi, tmp_image_ds_loc) [ 1611.851037] env[68906]: ERROR nova.compute.manager [instance: db011373-7285-4882-8bce-d39cfa22fe80] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1611.851356] env[68906]: ERROR nova.compute.manager [instance: db011373-7285-4882-8bce-d39cfa22fe80] vm_util.copy_virtual_disk( [ 1611.851356] env[68906]: ERROR nova.compute.manager [instance: db011373-7285-4882-8bce-d39cfa22fe80] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1611.851356] env[68906]: ERROR nova.compute.manager [instance: db011373-7285-4882-8bce-d39cfa22fe80] session._wait_for_task(vmdk_copy_task) [ 1611.851356] env[68906]: ERROR nova.compute.manager [instance: db011373-7285-4882-8bce-d39cfa22fe80] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1611.851356] env[68906]: ERROR nova.compute.manager [instance: db011373-7285-4882-8bce-d39cfa22fe80] return self.wait_for_task(task_ref) [ 1611.851356] env[68906]: ERROR nova.compute.manager [instance: db011373-7285-4882-8bce-d39cfa22fe80] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1611.851356] env[68906]: ERROR nova.compute.manager [instance: db011373-7285-4882-8bce-d39cfa22fe80] return evt.wait() [ 1611.851356] env[68906]: ERROR nova.compute.manager [instance: db011373-7285-4882-8bce-d39cfa22fe80] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1611.851356] env[68906]: ERROR nova.compute.manager [instance: db011373-7285-4882-8bce-d39cfa22fe80] result = hub.switch() [ 1611.851356] env[68906]: ERROR nova.compute.manager [instance: db011373-7285-4882-8bce-d39cfa22fe80] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1611.851356] env[68906]: ERROR nova.compute.manager [instance: db011373-7285-4882-8bce-d39cfa22fe80] return self.greenlet.switch() [ 1611.851356] env[68906]: ERROR nova.compute.manager [instance: db011373-7285-4882-8bce-d39cfa22fe80] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1611.851356] env[68906]: ERROR nova.compute.manager [instance: db011373-7285-4882-8bce-d39cfa22fe80] self.f(*self.args, **self.kw) [ 1611.851668] env[68906]: ERROR nova.compute.manager [instance: db011373-7285-4882-8bce-d39cfa22fe80] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1611.851668] env[68906]: ERROR nova.compute.manager [instance: db011373-7285-4882-8bce-d39cfa22fe80] raise exceptions.translate_fault(task_info.error) [ 1611.851668] env[68906]: ERROR nova.compute.manager [instance: db011373-7285-4882-8bce-d39cfa22fe80] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1611.851668] env[68906]: ERROR nova.compute.manager [instance: db011373-7285-4882-8bce-d39cfa22fe80] Faults: ['InvalidArgument'] [ 1611.851668] env[68906]: ERROR nova.compute.manager [instance: db011373-7285-4882-8bce-d39cfa22fe80] [ 1611.851803] env[68906]: DEBUG nova.compute.utils [None req-4d72aadc-5a5f-4626-b220-381c1afbdf3f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] [instance: db011373-7285-4882-8bce-d39cfa22fe80] VimFaultException {{(pid=68906) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1611.853053] env[68906]: DEBUG nova.compute.manager [None req-4d72aadc-5a5f-4626-b220-381c1afbdf3f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] [instance: db011373-7285-4882-8bce-d39cfa22fe80] Build of instance db011373-7285-4882-8bce-d39cfa22fe80 was re-scheduled: A specified parameter was not correct: fileType [ 1611.853053] env[68906]: Faults: ['InvalidArgument'] {{(pid=68906) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1611.853458] env[68906]: DEBUG nova.compute.manager [None req-4d72aadc-5a5f-4626-b220-381c1afbdf3f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] [instance: db011373-7285-4882-8bce-d39cfa22fe80] Unplugging VIFs for instance {{(pid=68906) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1611.853635] env[68906]: DEBUG nova.compute.manager [None req-4d72aadc-5a5f-4626-b220-381c1afbdf3f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=68906) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1611.853804] env[68906]: DEBUG nova.compute.manager [None req-4d72aadc-5a5f-4626-b220-381c1afbdf3f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] [instance: db011373-7285-4882-8bce-d39cfa22fe80] Deallocating network for instance {{(pid=68906) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1611.853966] env[68906]: DEBUG nova.network.neutron [None req-4d72aadc-5a5f-4626-b220-381c1afbdf3f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] [instance: db011373-7285-4882-8bce-d39cfa22fe80] deallocate_for_instance() {{(pid=68906) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1612.141196] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1612.166571] env[68906]: DEBUG nova.network.neutron [None req-4d72aadc-5a5f-4626-b220-381c1afbdf3f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] [instance: db011373-7285-4882-8bce-d39cfa22fe80] Updating instance_info_cache with network_info: [] {{(pid=68906) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1612.181441] env[68906]: INFO nova.compute.manager [None req-4d72aadc-5a5f-4626-b220-381c1afbdf3f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] [instance: db011373-7285-4882-8bce-d39cfa22fe80] Took 0.33 seconds to deallocate network for instance. 
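The rw_handles entries earlier in this section show the image-transfer mechanics for instance 1fdb401a: Nova acquires a generic service ticket from the SessionManager, opens an HTTPS write connection to the ESX host's /folder datastore endpoint, and streams the 21,318,656-byte sparse VMDK into the vmware_temp directory. A rough sketch of that upload using requests; the real code streams chunk by chunk through oslo.vmware's FileWriteHandle, and the ticket cookie name below is an assumption for illustration:

import requests

def write_image_to_datastore(url, image_iter, ticket, verify=True):
    """PUT image bytes to an ESX /folder datastore URL (simplified sketch)."""
    # For a ~21 MB image a one-shot body keeps the sketch short; the real
    # handle writes the iterator chunk by chunk without buffering it all.
    body = b"".join(image_iter)
    # The generic service ticket authorizes the transfer; cookie name assumed.
    resp = requests.put(url, data=body,
                        cookies={"vmware_cgi_ticket": ticket},
                        verify=verify)
    resp.raise_for_status()
    return len(body)

"Completed reading data from the image iterator" and "Closing write handle" in the log mark the two ends of exactly this transfer.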
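The traceback above is the whole failure path in one place: CopyVirtualDisk_Task fails server-side with an InvalidArgument fault on fileType, _poll_task translates the task error into VimFaultException, and _do_build_and_run_instance reacts by re-scheduling the build (unplugging VIFs best-effort and deallocating the network) instead of failing the boot outright. A compressed sketch of that translate-and-reschedule flow, using simplified stand-ins rather than Nova's or oslo.vmware's actual classes:

class VimFaultException(Exception):
    """Stand-in for oslo_vmware.exceptions.VimFaultException."""
    def __init__(self, fault_list, message):
        super().__init__(message)
        self.fault_list = fault_list

def poll_task_result(task_info):
    # _poll_task raises the translated fault once the task reports an error.
    if task_info["state"] == "error":
        raise VimFaultException(task_info["faults"], task_info["message"])
    return task_info

def do_build_and_run_instance(spawn):
    try:
        spawn()
        return "ACTIVE"
    except VimFaultException as exc:
        # Matches the log: claim aborted, VIFs unplugged (best effort),
        # network deallocated, then the scheduler picks a host again.
        return f"RESCHEDULED ({', '.join(exc.fault_list)})"

Driving it with the fault from the log ({"state": "error", "faults": ["InvalidArgument"], "message": "A specified parameter was not correct: fileType"}) yields "RESCHEDULED (InvalidArgument)", mirroring the "Build of instance ... was re-scheduled" entry.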
[ 1612.270286] env[68906]: INFO nova.scheduler.client.report [None req-4d72aadc-5a5f-4626-b220-381c1afbdf3f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Deleted allocations for instance db011373-7285-4882-8bce-d39cfa22fe80 [ 1612.291537] env[68906]: DEBUG oslo_concurrency.lockutils [None req-4d72aadc-5a5f-4626-b220-381c1afbdf3f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Lock "db011373-7285-4882-8bce-d39cfa22fe80" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 631.276s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1612.292868] env[68906]: DEBUG oslo_concurrency.lockutils [None req-84b4443f-5b82-4301-af42-70e59908a5fd tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Lock "db011373-7285-4882-8bce-d39cfa22fe80" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 434.355s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1612.292868] env[68906]: DEBUG oslo_concurrency.lockutils [None req-84b4443f-5b82-4301-af42-70e59908a5fd tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Acquiring lock "db011373-7285-4882-8bce-d39cfa22fe80-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1612.293084] env[68906]: DEBUG oslo_concurrency.lockutils [None req-84b4443f-5b82-4301-af42-70e59908a5fd tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Lock "db011373-7285-4882-8bce-d39cfa22fe80-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1612.293226] env[68906]: DEBUG oslo_concurrency.lockutils [None req-84b4443f-5b82-4301-af42-70e59908a5fd tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Lock "db011373-7285-4882-8bce-d39cfa22fe80-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1612.295165] env[68906]: INFO nova.compute.manager [None req-84b4443f-5b82-4301-af42-70e59908a5fd tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] [instance: db011373-7285-4882-8bce-d39cfa22fe80] Terminating instance [ 1612.296814] env[68906]: DEBUG nova.compute.manager [None req-84b4443f-5b82-4301-af42-70e59908a5fd tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] [instance: db011373-7285-4882-8bce-d39cfa22fe80] Start destroying the instance on the hypervisor. 
{{(pid=68906) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1612.297015] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-84b4443f-5b82-4301-af42-70e59908a5fd tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] [instance: db011373-7285-4882-8bce-d39cfa22fe80] Destroying instance {{(pid=68906) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1612.297475] env[68906]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-c3666a4d-b124-4b99-b208-7f7135ae9710 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1612.306533] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-09a9ed43-2d30-44b6-b569-5054368160b3 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1612.317455] env[68906]: DEBUG nova.compute.manager [None req-90e8e710-720b-4aa7-ba38-0b17c5edf061 tempest-AttachVolumeTestJSON-1667500444 tempest-AttachVolumeTestJSON-1667500444-project-member] [instance: 922d81ba-c8d2-43ba-b1c5-f2943418d6a2] Starting instance... {{(pid=68906) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1612.339282] env[68906]: WARNING nova.virt.vmwareapi.vmops [None req-84b4443f-5b82-4301-af42-70e59908a5fd tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] [instance: db011373-7285-4882-8bce-d39cfa22fe80] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance db011373-7285-4882-8bce-d39cfa22fe80 could not be found. [ 1612.339282] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-84b4443f-5b82-4301-af42-70e59908a5fd tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] [instance: db011373-7285-4882-8bce-d39cfa22fe80] Instance destroyed {{(pid=68906) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1612.339282] env[68906]: INFO nova.compute.manager [None req-84b4443f-5b82-4301-af42-70e59908a5fd tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] [instance: db011373-7285-4882-8bce-d39cfa22fe80] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1612.339456] env[68906]: DEBUG oslo.service.loopingcall [None req-84b4443f-5b82-4301-af42-70e59908a5fd tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=68906) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1612.339548] env[68906]: DEBUG nova.compute.manager [-] [instance: db011373-7285-4882-8bce-d39cfa22fe80] Deallocating network for instance {{(pid=68906) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1612.339631] env[68906]: DEBUG nova.network.neutron [-] [instance: db011373-7285-4882-8bce-d39cfa22fe80] deallocate_for_instance() {{(pid=68906) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1612.370405] env[68906]: DEBUG nova.network.neutron [-] [instance: db011373-7285-4882-8bce-d39cfa22fe80] Updating instance_info_cache with network_info: [] {{(pid=68906) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1612.372141] env[68906]: DEBUG oslo_concurrency.lockutils [None req-90e8e710-720b-4aa7-ba38-0b17c5edf061 tempest-AttachVolumeTestJSON-1667500444 tempest-AttachVolumeTestJSON-1667500444-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1612.372409] env[68906]: DEBUG oslo_concurrency.lockutils [None req-90e8e710-720b-4aa7-ba38-0b17c5edf061 tempest-AttachVolumeTestJSON-1667500444 tempest-AttachVolumeTestJSON-1667500444-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1612.373950] env[68906]: INFO nova.compute.claims [None req-90e8e710-720b-4aa7-ba38-0b17c5edf061 tempest-AttachVolumeTestJSON-1667500444 tempest-AttachVolumeTestJSON-1667500444-project-member] [instance: 922d81ba-c8d2-43ba-b1c5-f2943418d6a2] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1612.378162] env[68906]: INFO nova.compute.manager [-] [instance: db011373-7285-4882-8bce-d39cfa22fe80] Took 0.04 seconds to deallocate network for instance. [ 1612.502332] env[68906]: DEBUG oslo_concurrency.lockutils [None req-84b4443f-5b82-4301-af42-70e59908a5fd tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Lock "db011373-7285-4882-8bce-d39cfa22fe80" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.210s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1612.503380] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Lock "db011373-7285-4882-8bce-d39cfa22fe80" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 401.550s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1612.503647] env[68906]: INFO nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: db011373-7285-4882-8bce-d39cfa22fe80] During sync_power_state the instance has a pending task (deleting). Skip. 
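A recurring shape throughout this log is the lock triplet: 'Acquiring lock X by Y', 'acquired ... waited N s', 'released ... held N s' (note the per-instance build lock above held for 631.276s, i.e. the entire build attempt, and the terminate request that waited 434.355s behind it). These come from oslo.concurrency's named-semaphore decorator, whose "inner" wrapper logs the wait and hold durations around the decorated call. A toy version of the resource-tracker usage, with stand-in method bodies rather than Nova's signatures:

from oslo_concurrency import lockutils

class ResourceTracker:
    """Toy tracker showing the named-semaphore pattern seen in the log."""

    @lockutils.synchronized("compute_resources")
    def instance_claim(self, instance_uuid):
        # All claim/abort/update paths funnel through this one semaphore,
        # which is why the log strictly interleaves their waited/held lines.
        return f"claimed {instance_uuid}"

    @lockutils.synchronized("compute_resources")
    def abort_instance_claim(self, instance_uuid):
        return f"aborted {instance_uuid}"

Because every mutation of tracked resources serializes on "compute_resources", a slow periodic audit (like the 0.434s hold later in this section) directly delays concurrent claims.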
[ 1612.503732] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Lock "db011373-7285-4882-8bce-d39cfa22fe80" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1612.603881] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-eb14f113-9147-4ae1-b398-f2c688f142f5 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1612.611574] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-87de0952-d05d-4f81-9dcc-7def4d7bbbb1 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1612.641593] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5deaafe5-24a0-49dc-8835-c7742a9c7238 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1612.648191] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-147e77a7-5829-4453-8ac8-69766f3669a2 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1612.661228] env[68906]: DEBUG nova.compute.provider_tree [None req-90e8e710-720b-4aa7-ba38-0b17c5edf061 tempest-AttachVolumeTestJSON-1667500444 tempest-AttachVolumeTestJSON-1667500444-project-member] Inventory has not changed in ProviderTree for provider: 1119f6db-bfd7-4ef3-bdff-5c6974dc249b {{(pid=68906) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1612.670812] env[68906]: DEBUG nova.scheduler.client.report [None req-90e8e710-720b-4aa7-ba38-0b17c5edf061 tempest-AttachVolumeTestJSON-1667500444 tempest-AttachVolumeTestJSON-1667500444-project-member] Inventory has not changed for provider 1119f6db-bfd7-4ef3-bdff-5c6974dc249b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 93, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68906) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1612.684309] env[68906]: DEBUG oslo_concurrency.lockutils [None req-90e8e710-720b-4aa7-ba38-0b17c5edf061 tempest-AttachVolumeTestJSON-1667500444 tempest-AttachVolumeTestJSON-1667500444-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.312s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1612.684765] env[68906]: DEBUG nova.compute.manager [None req-90e8e710-720b-4aa7-ba38-0b17c5edf061 tempest-AttachVolumeTestJSON-1667500444 tempest-AttachVolumeTestJSON-1667500444-project-member] [instance: 922d81ba-c8d2-43ba-b1c5-f2943418d6a2] Start building networks asynchronously for instance. 
{{(pid=68906) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1612.716979] env[68906]: DEBUG nova.compute.utils [None req-90e8e710-720b-4aa7-ba38-0b17c5edf061 tempest-AttachVolumeTestJSON-1667500444 tempest-AttachVolumeTestJSON-1667500444-project-member] Using /dev/sd instead of None {{(pid=68906) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1612.718356] env[68906]: DEBUG nova.compute.manager [None req-90e8e710-720b-4aa7-ba38-0b17c5edf061 tempest-AttachVolumeTestJSON-1667500444 tempest-AttachVolumeTestJSON-1667500444-project-member] [instance: 922d81ba-c8d2-43ba-b1c5-f2943418d6a2] Allocating IP information in the background. {{(pid=68906) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1612.718528] env[68906]: DEBUG nova.network.neutron [None req-90e8e710-720b-4aa7-ba38-0b17c5edf061 tempest-AttachVolumeTestJSON-1667500444 tempest-AttachVolumeTestJSON-1667500444-project-member] [instance: 922d81ba-c8d2-43ba-b1c5-f2943418d6a2] allocate_for_instance() {{(pid=68906) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1612.728690] env[68906]: DEBUG nova.compute.manager [None req-90e8e710-720b-4aa7-ba38-0b17c5edf061 tempest-AttachVolumeTestJSON-1667500444 tempest-AttachVolumeTestJSON-1667500444-project-member] [instance: 922d81ba-c8d2-43ba-b1c5-f2943418d6a2] Start building block device mappings for instance. {{(pid=68906) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1612.776687] env[68906]: DEBUG nova.policy [None req-90e8e710-720b-4aa7-ba38-0b17c5edf061 tempest-AttachVolumeTestJSON-1667500444 tempest-AttachVolumeTestJSON-1667500444-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '97b90d0bf6244d02bb9f4133aa781bd8', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '530f8be6c3934b3aa339c5c3e09cf9d9', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68906) authorize /opt/stack/nova/nova/policy.py:203}} [ 1612.793357] env[68906]: DEBUG nova.compute.manager [None req-90e8e710-720b-4aa7-ba38-0b17c5edf061 tempest-AttachVolumeTestJSON-1667500444 tempest-AttachVolumeTestJSON-1667500444-project-member] [instance: 922d81ba-c8d2-43ba-b1c5-f2943418d6a2] Start spawning the instance on the hypervisor. 
{{(pid=68906) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1612.818201] env[68906]: DEBUG nova.virt.hardware [None req-90e8e710-720b-4aa7-ba38-0b17c5edf061 tempest-AttachVolumeTestJSON-1667500444 tempest-AttachVolumeTestJSON-1667500444-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-17T13:00:39Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-17T13:00:23Z,direct_url=,disk_format='vmdk',id=b1400c31-d33b-4e13-944f-4c645e62493e,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='1ae7bf3a375d41c6af5e7536af51ffd1',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-17T13:00:24Z,virtual_size=,visibility=), allow threads: False {{(pid=68906) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1612.818489] env[68906]: DEBUG nova.virt.hardware [None req-90e8e710-720b-4aa7-ba38-0b17c5edf061 tempest-AttachVolumeTestJSON-1667500444 tempest-AttachVolumeTestJSON-1667500444-project-member] Flavor limits 0:0:0 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1612.818659] env[68906]: DEBUG nova.virt.hardware [None req-90e8e710-720b-4aa7-ba38-0b17c5edf061 tempest-AttachVolumeTestJSON-1667500444 tempest-AttachVolumeTestJSON-1667500444-project-member] Image limits 0:0:0 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1612.818841] env[68906]: DEBUG nova.virt.hardware [None req-90e8e710-720b-4aa7-ba38-0b17c5edf061 tempest-AttachVolumeTestJSON-1667500444 tempest-AttachVolumeTestJSON-1667500444-project-member] Flavor pref 0:0:0 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1612.818990] env[68906]: DEBUG nova.virt.hardware [None req-90e8e710-720b-4aa7-ba38-0b17c5edf061 tempest-AttachVolumeTestJSON-1667500444 tempest-AttachVolumeTestJSON-1667500444-project-member] Image pref 0:0:0 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1612.819157] env[68906]: DEBUG nova.virt.hardware [None req-90e8e710-720b-4aa7-ba38-0b17c5edf061 tempest-AttachVolumeTestJSON-1667500444 tempest-AttachVolumeTestJSON-1667500444-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1612.819465] env[68906]: DEBUG nova.virt.hardware [None req-90e8e710-720b-4aa7-ba38-0b17c5edf061 tempest-AttachVolumeTestJSON-1667500444 tempest-AttachVolumeTestJSON-1667500444-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68906) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1612.819642] env[68906]: DEBUG nova.virt.hardware [None req-90e8e710-720b-4aa7-ba38-0b17c5edf061 tempest-AttachVolumeTestJSON-1667500444 tempest-AttachVolumeTestJSON-1667500444-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68906) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1612.819810] env[68906]: DEBUG nova.virt.hardware [None 
req-90e8e710-720b-4aa7-ba38-0b17c5edf061 tempest-AttachVolumeTestJSON-1667500444 tempest-AttachVolumeTestJSON-1667500444-project-member] Got 1 possible topologies {{(pid=68906) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1612.819973] env[68906]: DEBUG nova.virt.hardware [None req-90e8e710-720b-4aa7-ba38-0b17c5edf061 tempest-AttachVolumeTestJSON-1667500444 tempest-AttachVolumeTestJSON-1667500444-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68906) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1612.820164] env[68906]: DEBUG nova.virt.hardware [None req-90e8e710-720b-4aa7-ba38-0b17c5edf061 tempest-AttachVolumeTestJSON-1667500444 tempest-AttachVolumeTestJSON-1667500444-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68906) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1612.820989] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d3a6ef52-4d1c-4779-8080-d473ac45381c {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1612.829020] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-58232800-b38e-42b9-8b1b-3e35a17087f1 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1613.085754] env[68906]: DEBUG nova.network.neutron [None req-90e8e710-720b-4aa7-ba38-0b17c5edf061 tempest-AttachVolumeTestJSON-1667500444 tempest-AttachVolumeTestJSON-1667500444-project-member] [instance: 922d81ba-c8d2-43ba-b1c5-f2943418d6a2] Successfully created port: e173920e-d4f1-47fb-88de-682211e9a34b {{(pid=68906) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1613.140796] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager.update_available_resource {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1613.155503] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1613.155741] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1613.155940] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1613.156124] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=68906) update_available_resource 
/opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1613.157581] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-90f471be-3cc0-41d0-bf4f-85d1f612b3df {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1613.169473] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4492ae06-bc3d-464d-88a0-87d96a3c640f {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1613.186231] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cdac8400-0d2c-4ac4-83ac-cb9791a06039 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1613.193947] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-552d0ba6-1607-48e5-939d-7e413b59862c {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1613.226435] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180950MB free_disk=93GB free_vcpus=48 pci_devices=None {{(pid=68906) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1613.226435] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1613.226435] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1613.325150] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 1fdb401a-ac25-4418-803c-fc0b2297f2d4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1613.325331] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 6c28f571-e74a-48f9-9cc7-a9e4ddea8b09 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1613.325463] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance e0e595e3-e47e-4cf1-8977-f004eca942d1 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1613.325585] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 7466df8a-59a9-49b9-bff7-c4efbeae3eee actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1613.325706] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 89171680-c76d-4826-9236-379542661ffb actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1613.325827] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 9b884416-df89-4d8c-b2ab-0667db52a718 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1613.325979] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance aed06616-d008-4695-b66e-9f40acf5ebd3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1613.326120] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 17327bc3-433e-4006-93c7-e53714ed70c2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1613.326238] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 32f5b54d-30bf-4fe9-9622-3ff74344b3f3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1613.326353] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 922d81ba-c8d2-43ba-b1c5-f2943418d6a2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1613.340490] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 302e2275-a3ec-48c5-899e-6f385190bfe8 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1613.353893] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance a59ab448-c4f1-4f54-be7a-7e204130f3f8 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1613.365762] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 736db39c-e5e5-4a54-b85a-aa5c703f432e has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1613.377546] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 7faa4c32-7572-4594-a760-e928607bf2b6 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1613.388443] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance ce6e5cd6-efb8-46d1-811d-74c084661cce has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1613.388715] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=68906) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1613.388922] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=68906) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1613.575796] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5961eb74-4d27-42a2-893f-3566a13e2279 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1613.583789] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dcdf95bd-52fb-4e56-914b-aa5c0a5fbcc4 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1613.614692] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8711ea77-f918-4b03-adb6-9e17ac00d890 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1613.622580] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-709b7b1a-2fd1-4e9b-87e5-c041ea17da43 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1613.635960] env[68906]: DEBUG nova.compute.provider_tree [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Inventory has not changed in ProviderTree for provider: 1119f6db-bfd7-4ef3-bdff-5c6974dc249b {{(pid=68906) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1613.644188] env[68906]: DEBUG nova.scheduler.client.report [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Inventory has not changed for provider 1119f6db-bfd7-4ef3-bdff-5c6974dc249b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 93, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68906) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1613.659942] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=68906) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1613.660159] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.434s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1613.723316] env[68906]: DEBUG nova.network.neutron [None 
req-90e8e710-720b-4aa7-ba38-0b17c5edf061 tempest-AttachVolumeTestJSON-1667500444 tempest-AttachVolumeTestJSON-1667500444-project-member] [instance: 922d81ba-c8d2-43ba-b1c5-f2943418d6a2] Successfully updated port: e173920e-d4f1-47fb-88de-682211e9a34b {{(pid=68906) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1613.733355] env[68906]: DEBUG oslo_concurrency.lockutils [None req-90e8e710-720b-4aa7-ba38-0b17c5edf061 tempest-AttachVolumeTestJSON-1667500444 tempest-AttachVolumeTestJSON-1667500444-project-member] Acquiring lock "refresh_cache-922d81ba-c8d2-43ba-b1c5-f2943418d6a2" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1613.733499] env[68906]: DEBUG oslo_concurrency.lockutils [None req-90e8e710-720b-4aa7-ba38-0b17c5edf061 tempest-AttachVolumeTestJSON-1667500444 tempest-AttachVolumeTestJSON-1667500444-project-member] Acquired lock "refresh_cache-922d81ba-c8d2-43ba-b1c5-f2943418d6a2" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1613.733647] env[68906]: DEBUG nova.network.neutron [None req-90e8e710-720b-4aa7-ba38-0b17c5edf061 tempest-AttachVolumeTestJSON-1667500444 tempest-AttachVolumeTestJSON-1667500444-project-member] [instance: 922d81ba-c8d2-43ba-b1c5-f2943418d6a2] Building network info cache for instance {{(pid=68906) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1613.771922] env[68906]: DEBUG nova.network.neutron [None req-90e8e710-720b-4aa7-ba38-0b17c5edf061 tempest-AttachVolumeTestJSON-1667500444 tempest-AttachVolumeTestJSON-1667500444-project-member] [instance: 922d81ba-c8d2-43ba-b1c5-f2943418d6a2] Instance cache missing network info. {{(pid=68906) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1613.924560] env[68906]: DEBUG nova.network.neutron [None req-90e8e710-720b-4aa7-ba38-0b17c5edf061 tempest-AttachVolumeTestJSON-1667500444 tempest-AttachVolumeTestJSON-1667500444-project-member] [instance: 922d81ba-c8d2-43ba-b1c5-f2943418d6a2] Updating instance_info_cache with network_info: [{"id": "e173920e-d4f1-47fb-88de-682211e9a34b", "address": "fa:16:3e:0a:a3:bd", "network": {"id": "fbd576ff-c6cf-4609-ba79-251b3480702c", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-1924323004-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "530f8be6c3934b3aa339c5c3e09cf9d9", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "8fedd232-bfc1-4e7f-bd5e-c43ef8f2f08a", "external-id": "nsx-vlan-transportzone-925", "segmentation_id": 925, "bound_drivers": {"0": "nsxv3"}}, "devname": "tape173920e-d4", "ovs_interfaceid": "e173920e-d4f1-47fb-88de-682211e9a34b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68906) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1613.936545] env[68906]: DEBUG oslo_concurrency.lockutils [None req-90e8e710-720b-4aa7-ba38-0b17c5edf061 tempest-AttachVolumeTestJSON-1667500444 
tempest-AttachVolumeTestJSON-1667500444-project-member] Releasing lock "refresh_cache-922d81ba-c8d2-43ba-b1c5-f2943418d6a2" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1613.936841] env[68906]: DEBUG nova.compute.manager [None req-90e8e710-720b-4aa7-ba38-0b17c5edf061 tempest-AttachVolumeTestJSON-1667500444 tempest-AttachVolumeTestJSON-1667500444-project-member] [instance: 922d81ba-c8d2-43ba-b1c5-f2943418d6a2] Instance network_info: |[{"id": "e173920e-d4f1-47fb-88de-682211e9a34b", "address": "fa:16:3e:0a:a3:bd", "network": {"id": "fbd576ff-c6cf-4609-ba79-251b3480702c", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-1924323004-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "530f8be6c3934b3aa339c5c3e09cf9d9", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "8fedd232-bfc1-4e7f-bd5e-c43ef8f2f08a", "external-id": "nsx-vlan-transportzone-925", "segmentation_id": 925, "bound_drivers": {"0": "nsxv3"}}, "devname": "tape173920e-d4", "ovs_interfaceid": "e173920e-d4f1-47fb-88de-682211e9a34b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=68906) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1613.937243] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-90e8e710-720b-4aa7-ba38-0b17c5edf061 tempest-AttachVolumeTestJSON-1667500444 tempest-AttachVolumeTestJSON-1667500444-project-member] [instance: 922d81ba-c8d2-43ba-b1c5-f2943418d6a2] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:0a:a3:bd', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '8fedd232-bfc1-4e7f-bd5e-c43ef8f2f08a', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'e173920e-d4f1-47fb-88de-682211e9a34b', 'vif_model': 'vmxnet3'}] {{(pid=68906) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1613.945139] env[68906]: DEBUG nova.virt.vmwareapi.vm_util [None req-90e8e710-720b-4aa7-ba38-0b17c5edf061 tempest-AttachVolumeTestJSON-1667500444 tempest-AttachVolumeTestJSON-1667500444-project-member] Creating folder: Project (530f8be6c3934b3aa339c5c3e09cf9d9). Parent ref: group-v694750. {{(pid=68906) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1613.946087] env[68906]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-7ee3c18d-e660-493a-a068-b2b46b86086a {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1613.956553] env[68906]: INFO nova.virt.vmwareapi.vm_util [None req-90e8e710-720b-4aa7-ba38-0b17c5edf061 tempest-AttachVolumeTestJSON-1667500444 tempest-AttachVolumeTestJSON-1667500444-project-member] Created folder: Project (530f8be6c3934b3aa339c5c3e09cf9d9) in parent group-v694750. 
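The inventory payload repeated throughout this section decodes as follows: Placement's schedulable capacity per resource class is (total - reserved) * allocation_ratio, and max_unit caps any single allocation, giving 48 * 4.0 = 192 VCPU, (196590 - 512) * 1.0 = 196078 MB of RAM, and 400 GB of disk here. The final resource view above is consistent with ten m1.nano guests: 10 * 128 MB + 512 MB reserved = 1792 MB used_ram, and 10 * 1 GB root disk = 10 GB used_disk. A quick check of that arithmetic:

inventory = {
    "VCPU": {"total": 48, "reserved": 0, "allocation_ratio": 4.0,
             "max_unit": 16},
    "MEMORY_MB": {"total": 196590, "reserved": 512, "allocation_ratio": 1.0,
                  "max_unit": 65530},
    "DISK_GB": {"total": 400, "reserved": 0, "allocation_ratio": 1.0,
                "max_unit": 93},
}

for rc, inv in inventory.items():
    schedulable = (inv["total"] - inv["reserved"]) * inv["allocation_ratio"]
    print(f"{rc}: {schedulable:.0f} schedulable, <= {inv['max_unit']} per allocation")

# Cross-check against the final resource view: ten m1.nano instances
# (1 vCPU, 128 MB, 1 GB each) plus the 512 MB host reservation.
assert 10 * 128 + 512 == 1792   # used_ram MB
assert 10 * 1 == 10             # used_disk GB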
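The nova.virt.hardware block above is the CPU-topology pipeline: flavor and image limits and preferences all come back 0:0:0 (unset), the limits default to 65536 per axis, and for the 1-vCPU m1.nano the only valid topology is 1 socket / 1 core / 1 thread, which is exactly the single candidate the log reports. A condensed sketch of the enumeration step, under the simplifying assumption that a topology is valid when sockets * cores * threads equals the vCPU count and each axis respects its limit (Nova's real logic also orders candidates by preference):

from itertools import product

def possible_cpu_topologies(vcpus, max_sockets=65536, max_cores=65536,
                            max_threads=65536):
    """Enumerate (sockets, cores, threads) triples that cover `vcpus`."""
    found = []
    for s, c, t in product(range(1, min(vcpus, max_sockets) + 1),
                           range(1, min(vcpus, max_cores) + 1),
                           range(1, min(vcpus, max_threads) + 1)):
        if s * c * t == vcpus:
            found.append((s, c, t))
    return found

print(possible_cpu_topologies(1))  # [(1, 1, 1)] -> "Got 1 possible topologies"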
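The 'Instance VIF info' entry above also shows how the neutron port dict is flattened for the VMware driver: the integration bridge becomes network_name, the NSX logical-switch id becomes an OpaqueNetwork reference, and the port id rides along as iface_id. A reconstruction of that mapping as a standalone function; the exact Nova code path differs, this mirrors only the transformation visible in the log:

def vif_info_from_network_info(vif):
    """Map one neutron VIF dict to the vmwareapi spawn input from the log."""
    details = vif["details"]
    return {
        "network_name": vif["network"]["bridge"],   # "br-int"
        "mac_address": vif["address"],              # "fa:16:3e:0a:a3:bd"
        "network_ref": {
            "type": "OpaqueNetwork",
            "network-id": details["nsx-logical-switch-id"],
            "network-type": "nsx.LogicalSwitch",
            "use-external-id": True,
        },
        "iface_id": vif["id"],                      # the neutron port UUID
        "vif_model": "vmxnet3",                     # from the image name hints
    }

Feeding the port dict logged for e173920e-d4f1-47fb-88de-682211e9a34b through this function reproduces the VIF info entry that build_virtual_machine consumes.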
[ 1613.956726] env[68906]: DEBUG nova.virt.vmwareapi.vm_util [None req-90e8e710-720b-4aa7-ba38-0b17c5edf061 tempest-AttachVolumeTestJSON-1667500444 tempest-AttachVolumeTestJSON-1667500444-project-member] Creating folder: Instances. Parent ref: group-v694837. {{(pid=68906) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1613.956942] env[68906]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-9f50b3b0-4229-4577-a62d-f441da60bcf0 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1613.965407] env[68906]: INFO nova.virt.vmwareapi.vm_util [None req-90e8e710-720b-4aa7-ba38-0b17c5edf061 tempest-AttachVolumeTestJSON-1667500444 tempest-AttachVolumeTestJSON-1667500444-project-member] Created folder: Instances in parent group-v694837. [ 1613.965627] env[68906]: DEBUG oslo.service.loopingcall [None req-90e8e710-720b-4aa7-ba38-0b17c5edf061 tempest-AttachVolumeTestJSON-1667500444 tempest-AttachVolumeTestJSON-1667500444-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=68906) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1613.965802] env[68906]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 922d81ba-c8d2-43ba-b1c5-f2943418d6a2] Creating VM on the ESX host {{(pid=68906) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1613.965987] env[68906]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-3901c75e-dda7-42da-9121-a69b448e90ce {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1613.983785] env[68906]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1613.983785] env[68906]: value = "task-3475415" [ 1613.983785] env[68906]: _type = "Task" [ 1613.983785] env[68906]: } to complete. {{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1613.990688] env[68906]: DEBUG oslo_vmware.api [-] Task: {'id': task-3475415, 'name': CreateVM_Task} progress is 0%. 
{{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1614.333405] env[68906]: DEBUG nova.compute.manager [req-f0f90bd2-93c1-42ff-b35d-942ff00f1fb5 req-bfbbcf8a-9319-49f7-b5cb-da18aee5fa51 service nova] [instance: 922d81ba-c8d2-43ba-b1c5-f2943418d6a2] Received event network-vif-plugged-e173920e-d4f1-47fb-88de-682211e9a34b {{(pid=68906) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1614.333623] env[68906]: DEBUG oslo_concurrency.lockutils [req-f0f90bd2-93c1-42ff-b35d-942ff00f1fb5 req-bfbbcf8a-9319-49f7-b5cb-da18aee5fa51 service nova] Acquiring lock "922d81ba-c8d2-43ba-b1c5-f2943418d6a2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1614.333788] env[68906]: DEBUG oslo_concurrency.lockutils [req-f0f90bd2-93c1-42ff-b35d-942ff00f1fb5 req-bfbbcf8a-9319-49f7-b5cb-da18aee5fa51 service nova] Lock "922d81ba-c8d2-43ba-b1c5-f2943418d6a2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1614.333963] env[68906]: DEBUG oslo_concurrency.lockutils [req-f0f90bd2-93c1-42ff-b35d-942ff00f1fb5 req-bfbbcf8a-9319-49f7-b5cb-da18aee5fa51 service nova] Lock "922d81ba-c8d2-43ba-b1c5-f2943418d6a2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1614.334147] env[68906]: DEBUG nova.compute.manager [req-f0f90bd2-93c1-42ff-b35d-942ff00f1fb5 req-bfbbcf8a-9319-49f7-b5cb-da18aee5fa51 service nova] [instance: 922d81ba-c8d2-43ba-b1c5-f2943418d6a2] No waiting events found dispatching network-vif-plugged-e173920e-d4f1-47fb-88de-682211e9a34b {{(pid=68906) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1614.334312] env[68906]: WARNING nova.compute.manager [req-f0f90bd2-93c1-42ff-b35d-942ff00f1fb5 req-bfbbcf8a-9319-49f7-b5cb-da18aee5fa51 service nova] [instance: 922d81ba-c8d2-43ba-b1c5-f2943418d6a2] Received unexpected event network-vif-plugged-e173920e-d4f1-47fb-88de-682211e9a34b for instance with vm_state building and task_state spawning. [ 1614.334479] env[68906]: DEBUG nova.compute.manager [req-f0f90bd2-93c1-42ff-b35d-942ff00f1fb5 req-bfbbcf8a-9319-49f7-b5cb-da18aee5fa51 service nova] [instance: 922d81ba-c8d2-43ba-b1c5-f2943418d6a2] Received event network-changed-e173920e-d4f1-47fb-88de-682211e9a34b {{(pid=68906) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1614.334685] env[68906]: DEBUG nova.compute.manager [req-f0f90bd2-93c1-42ff-b35d-942ff00f1fb5 req-bfbbcf8a-9319-49f7-b5cb-da18aee5fa51 service nova] [instance: 922d81ba-c8d2-43ba-b1c5-f2943418d6a2] Refreshing instance network info cache due to event network-changed-e173920e-d4f1-47fb-88de-682211e9a34b. 
{{(pid=68906) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1614.334809] env[68906]: DEBUG oslo_concurrency.lockutils [req-f0f90bd2-93c1-42ff-b35d-942ff00f1fb5 req-bfbbcf8a-9319-49f7-b5cb-da18aee5fa51 service nova] Acquiring lock "refresh_cache-922d81ba-c8d2-43ba-b1c5-f2943418d6a2" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1614.335077] env[68906]: DEBUG oslo_concurrency.lockutils [req-f0f90bd2-93c1-42ff-b35d-942ff00f1fb5 req-bfbbcf8a-9319-49f7-b5cb-da18aee5fa51 service nova] Acquired lock "refresh_cache-922d81ba-c8d2-43ba-b1c5-f2943418d6a2" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1614.335168] env[68906]: DEBUG nova.network.neutron [req-f0f90bd2-93c1-42ff-b35d-942ff00f1fb5 req-bfbbcf8a-9319-49f7-b5cb-da18aee5fa51 service nova] [instance: 922d81ba-c8d2-43ba-b1c5-f2943418d6a2] Refreshing network info cache for port e173920e-d4f1-47fb-88de-682211e9a34b {{(pid=68906) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1614.495372] env[68906]: DEBUG oslo_vmware.api [-] Task: {'id': task-3475415, 'name': CreateVM_Task, 'duration_secs': 0.288492} completed successfully. {{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1614.495554] env[68906]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 922d81ba-c8d2-43ba-b1c5-f2943418d6a2] Created VM on the ESX host {{(pid=68906) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1614.496193] env[68906]: DEBUG oslo_concurrency.lockutils [None req-90e8e710-720b-4aa7-ba38-0b17c5edf061 tempest-AttachVolumeTestJSON-1667500444 tempest-AttachVolumeTestJSON-1667500444-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1614.496363] env[68906]: DEBUG oslo_concurrency.lockutils [None req-90e8e710-720b-4aa7-ba38-0b17c5edf061 tempest-AttachVolumeTestJSON-1667500444 tempest-AttachVolumeTestJSON-1667500444-project-member] Acquired lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1614.496706] env[68906]: DEBUG oslo_concurrency.lockutils [None req-90e8e710-720b-4aa7-ba38-0b17c5edf061 tempest-AttachVolumeTestJSON-1667500444 tempest-AttachVolumeTestJSON-1667500444-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1614.496913] env[68906]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-c542b40b-9b86-46ba-abfd-086b01d4c5eb {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1614.501350] env[68906]: DEBUG oslo_vmware.api [None req-90e8e710-720b-4aa7-ba38-0b17c5edf061 tempest-AttachVolumeTestJSON-1667500444 tempest-AttachVolumeTestJSON-1667500444-project-member] Waiting for the task: (returnval){ [ 1614.501350] env[68906]: value = "session[52a3cb0d-1212-490c-2669-91043b4da4d8]52801a5e-41d6-e981-afee-2330a5dd7d6b" [ 1614.501350] env[68906]: _type = "Task" [ 1614.501350] env[68906]: } to complete. 
{{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1614.510216] env[68906]: DEBUG oslo_vmware.api [None req-90e8e710-720b-4aa7-ba38-0b17c5edf061 tempest-AttachVolumeTestJSON-1667500444 tempest-AttachVolumeTestJSON-1667500444-project-member] Task: {'id': session[52a3cb0d-1212-490c-2669-91043b4da4d8]52801a5e-41d6-e981-afee-2330a5dd7d6b, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1614.575300] env[68906]: DEBUG nova.network.neutron [req-f0f90bd2-93c1-42ff-b35d-942ff00f1fb5 req-bfbbcf8a-9319-49f7-b5cb-da18aee5fa51 service nova] [instance: 922d81ba-c8d2-43ba-b1c5-f2943418d6a2] Updated VIF entry in instance network info cache for port e173920e-d4f1-47fb-88de-682211e9a34b. {{(pid=68906) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1614.575644] env[68906]: DEBUG nova.network.neutron [req-f0f90bd2-93c1-42ff-b35d-942ff00f1fb5 req-bfbbcf8a-9319-49f7-b5cb-da18aee5fa51 service nova] [instance: 922d81ba-c8d2-43ba-b1c5-f2943418d6a2] Updating instance_info_cache with network_info: [{"id": "e173920e-d4f1-47fb-88de-682211e9a34b", "address": "fa:16:3e:0a:a3:bd", "network": {"id": "fbd576ff-c6cf-4609-ba79-251b3480702c", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-1924323004-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "530f8be6c3934b3aa339c5c3e09cf9d9", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "8fedd232-bfc1-4e7f-bd5e-c43ef8f2f08a", "external-id": "nsx-vlan-transportzone-925", "segmentation_id": 925, "bound_drivers": {"0": "nsxv3"}}, "devname": "tape173920e-d4", "ovs_interfaceid": "e173920e-d4f1-47fb-88de-682211e9a34b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68906) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1614.584965] env[68906]: DEBUG oslo_concurrency.lockutils [req-f0f90bd2-93c1-42ff-b35d-942ff00f1fb5 req-bfbbcf8a-9319-49f7-b5cb-da18aee5fa51 service nova] Releasing lock "refresh_cache-922d81ba-c8d2-43ba-b1c5-f2943418d6a2" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1614.654600] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1615.012385] env[68906]: DEBUG oslo_concurrency.lockutils [None req-90e8e710-720b-4aa7-ba38-0b17c5edf061 tempest-AttachVolumeTestJSON-1667500444 tempest-AttachVolumeTestJSON-1667500444-project-member] Releasing lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1615.012594] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None 
req-90e8e710-720b-4aa7-ba38-0b17c5edf061 tempest-AttachVolumeTestJSON-1667500444 tempest-AttachVolumeTestJSON-1667500444-project-member] [instance: 922d81ba-c8d2-43ba-b1c5-f2943418d6a2] Processing image b1400c31-d33b-4e13-944f-4c645e62493e {{(pid=68906) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1615.012870] env[68906]: DEBUG oslo_concurrency.lockutils [None req-90e8e710-720b-4aa7-ba38-0b17c5edf061 tempest-AttachVolumeTestJSON-1667500444 tempest-AttachVolumeTestJSON-1667500444-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e/b1400c31-d33b-4e13-944f-4c645e62493e.vmdk" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1617.948878] env[68906]: DEBUG oslo_concurrency.lockutils [None req-576bfd48-3236-4615-8685-9b78101a6ee3 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Acquiring lock "32f5b54d-30bf-4fe9-9622-3ff74344b3f3" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1631.470224] env[68906]: DEBUG oslo_concurrency.lockutils [None req-4af5cd17-ead4-446a-9184-7b7ebdecc7be tempest-ServerTagsTestJSON-693567183 tempest-ServerTagsTestJSON-693567183-project-member] Acquiring lock "7994d291-b4bf-48f5-ad34-c1f484d77f6e" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1631.470660] env[68906]: DEBUG oslo_concurrency.lockutils [None req-4af5cd17-ead4-446a-9184-7b7ebdecc7be tempest-ServerTagsTestJSON-693567183 tempest-ServerTagsTestJSON-693567183-project-member] Lock "7994d291-b4bf-48f5-ad34-c1f484d77f6e" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1647.957874] env[68906]: DEBUG oslo_concurrency.lockutils [None req-23bcf793-7882-41c0-a6ef-da7b87706b50 tempest-AttachVolumeTestJSON-1667500444 tempest-AttachVolumeTestJSON-1667500444-project-member] Acquiring lock "922d81ba-c8d2-43ba-b1c5-f2943418d6a2" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1652.845219] env[68906]: DEBUG oslo_concurrency.lockutils [None req-08d15b06-bc1d-4202-8ca5-a913682ec4aa tempest-ServerDiskConfigTestJSON-1909384467 tempest-ServerDiskConfigTestJSON-1909384467-project-member] Acquiring lock "860248ea-e77b-4ff6-af64-b75f88a31348" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1652.845535] env[68906]: DEBUG oslo_concurrency.lockutils [None req-08d15b06-bc1d-4202-8ca5-a913682ec4aa tempest-ServerDiskConfigTestJSON-1909384467 tempest-ServerDiskConfigTestJSON-1909384467-project-member] Lock "860248ea-e77b-4ff6-af64-b75f88a31348" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1657.837899] 
env[68906]: WARNING oslo_vmware.rw_handles [None req-ec74f4e8-e37f-4e29-9aee-a15837533de1 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1657.837899] env[68906]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1657.837899] env[68906]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1657.837899] env[68906]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1657.837899] env[68906]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1657.837899] env[68906]: ERROR oslo_vmware.rw_handles response.begin() [ 1657.837899] env[68906]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1657.837899] env[68906]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1657.837899] env[68906]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1657.837899] env[68906]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1657.837899] env[68906]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1657.837899] env[68906]: ERROR oslo_vmware.rw_handles [ 1657.838745] env[68906]: DEBUG nova.virt.vmwareapi.images [None req-ec74f4e8-e37f-4e29-9aee-a15837533de1 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: 1fdb401a-ac25-4418-803c-fc0b2297f2d4] Downloaded image file data b1400c31-d33b-4e13-944f-4c645e62493e to vmware_temp/69489f94-cac9-44bb-b420-b272298929a3/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk on the data store datastore2 {{(pid=68906) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1657.840204] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-ec74f4e8-e37f-4e29-9aee-a15837533de1 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: 1fdb401a-ac25-4418-803c-fc0b2297f2d4] Caching image {{(pid=68906) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1657.840432] env[68906]: DEBUG nova.virt.vmwareapi.vm_util [None req-ec74f4e8-e37f-4e29-9aee-a15837533de1 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Copying Virtual Disk [datastore2] vmware_temp/69489f94-cac9-44bb-b420-b272298929a3/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk to [datastore2] vmware_temp/69489f94-cac9-44bb-b420-b272298929a3/b1400c31-d33b-4e13-944f-4c645e62493e/b1400c31-d33b-4e13-944f-4c645e62493e.vmdk {{(pid=68906) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1657.840720] env[68906]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-53d65036-04e5-4692-8add-bf6e3660a027 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1657.848269] env[68906]: DEBUG oslo_vmware.api [None req-ec74f4e8-e37f-4e29-9aee-a15837533de1 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Waiting for the task: (returnval){ [ 1657.848269] env[68906]: value = "task-3475416" [ 1657.848269] env[68906]: _type = "Task" [ 1657.848269] env[68906]: } to complete. 
{{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1657.856351] env[68906]: DEBUG oslo_vmware.api [None req-ec74f4e8-e37f-4e29-9aee-a15837533de1 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Task: {'id': task-3475416, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1658.358462] env[68906]: DEBUG oslo_vmware.exceptions [None req-ec74f4e8-e37f-4e29-9aee-a15837533de1 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Fault InvalidArgument not matched. {{(pid=68906) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1658.358624] env[68906]: DEBUG oslo_concurrency.lockutils [None req-ec74f4e8-e37f-4e29-9aee-a15837533de1 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Releasing lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e/b1400c31-d33b-4e13-944f-4c645e62493e.vmdk" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1658.359238] env[68906]: ERROR nova.compute.manager [None req-ec74f4e8-e37f-4e29-9aee-a15837533de1 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: 1fdb401a-ac25-4418-803c-fc0b2297f2d4] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1658.359238] env[68906]: Faults: ['InvalidArgument'] [ 1658.359238] env[68906]: ERROR nova.compute.manager [instance: 1fdb401a-ac25-4418-803c-fc0b2297f2d4] Traceback (most recent call last): [ 1658.359238] env[68906]: ERROR nova.compute.manager [instance: 1fdb401a-ac25-4418-803c-fc0b2297f2d4] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1658.359238] env[68906]: ERROR nova.compute.manager [instance: 1fdb401a-ac25-4418-803c-fc0b2297f2d4] yield resources [ 1658.359238] env[68906]: ERROR nova.compute.manager [instance: 1fdb401a-ac25-4418-803c-fc0b2297f2d4] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1658.359238] env[68906]: ERROR nova.compute.manager [instance: 1fdb401a-ac25-4418-803c-fc0b2297f2d4] self.driver.spawn(context, instance, image_meta, [ 1658.359238] env[68906]: ERROR nova.compute.manager [instance: 1fdb401a-ac25-4418-803c-fc0b2297f2d4] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1658.359238] env[68906]: ERROR nova.compute.manager [instance: 1fdb401a-ac25-4418-803c-fc0b2297f2d4] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1658.359238] env[68906]: ERROR nova.compute.manager [instance: 1fdb401a-ac25-4418-803c-fc0b2297f2d4] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1658.359238] env[68906]: ERROR nova.compute.manager [instance: 1fdb401a-ac25-4418-803c-fc0b2297f2d4] self._fetch_image_if_missing(context, vi) [ 1658.359238] env[68906]: ERROR nova.compute.manager [instance: 1fdb401a-ac25-4418-803c-fc0b2297f2d4] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1658.359698] env[68906]: ERROR nova.compute.manager [instance: 1fdb401a-ac25-4418-803c-fc0b2297f2d4] image_cache(vi, tmp_image_ds_loc) [ 1658.359698] env[68906]: ERROR nova.compute.manager [instance: 1fdb401a-ac25-4418-803c-fc0b2297f2d4] File 
"/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1658.359698] env[68906]: ERROR nova.compute.manager [instance: 1fdb401a-ac25-4418-803c-fc0b2297f2d4] vm_util.copy_virtual_disk( [ 1658.359698] env[68906]: ERROR nova.compute.manager [instance: 1fdb401a-ac25-4418-803c-fc0b2297f2d4] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1658.359698] env[68906]: ERROR nova.compute.manager [instance: 1fdb401a-ac25-4418-803c-fc0b2297f2d4] session._wait_for_task(vmdk_copy_task) [ 1658.359698] env[68906]: ERROR nova.compute.manager [instance: 1fdb401a-ac25-4418-803c-fc0b2297f2d4] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1658.359698] env[68906]: ERROR nova.compute.manager [instance: 1fdb401a-ac25-4418-803c-fc0b2297f2d4] return self.wait_for_task(task_ref) [ 1658.359698] env[68906]: ERROR nova.compute.manager [instance: 1fdb401a-ac25-4418-803c-fc0b2297f2d4] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1658.359698] env[68906]: ERROR nova.compute.manager [instance: 1fdb401a-ac25-4418-803c-fc0b2297f2d4] return evt.wait() [ 1658.359698] env[68906]: ERROR nova.compute.manager [instance: 1fdb401a-ac25-4418-803c-fc0b2297f2d4] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1658.359698] env[68906]: ERROR nova.compute.manager [instance: 1fdb401a-ac25-4418-803c-fc0b2297f2d4] result = hub.switch() [ 1658.359698] env[68906]: ERROR nova.compute.manager [instance: 1fdb401a-ac25-4418-803c-fc0b2297f2d4] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1658.359698] env[68906]: ERROR nova.compute.manager [instance: 1fdb401a-ac25-4418-803c-fc0b2297f2d4] return self.greenlet.switch() [ 1658.360106] env[68906]: ERROR nova.compute.manager [instance: 1fdb401a-ac25-4418-803c-fc0b2297f2d4] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1658.360106] env[68906]: ERROR nova.compute.manager [instance: 1fdb401a-ac25-4418-803c-fc0b2297f2d4] self.f(*self.args, **self.kw) [ 1658.360106] env[68906]: ERROR nova.compute.manager [instance: 1fdb401a-ac25-4418-803c-fc0b2297f2d4] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1658.360106] env[68906]: ERROR nova.compute.manager [instance: 1fdb401a-ac25-4418-803c-fc0b2297f2d4] raise exceptions.translate_fault(task_info.error) [ 1658.360106] env[68906]: ERROR nova.compute.manager [instance: 1fdb401a-ac25-4418-803c-fc0b2297f2d4] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1658.360106] env[68906]: ERROR nova.compute.manager [instance: 1fdb401a-ac25-4418-803c-fc0b2297f2d4] Faults: ['InvalidArgument'] [ 1658.360106] env[68906]: ERROR nova.compute.manager [instance: 1fdb401a-ac25-4418-803c-fc0b2297f2d4] [ 1658.360106] env[68906]: INFO nova.compute.manager [None req-ec74f4e8-e37f-4e29-9aee-a15837533de1 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: 1fdb401a-ac25-4418-803c-fc0b2297f2d4] Terminating instance [ 1658.361187] env[68906]: DEBUG oslo_concurrency.lockutils [None req-bc278641-e9dc-495f-b85d-d30d2e1da09f tempest-ServerShowV254Test-1577433897 tempest-ServerShowV254Test-1577433897-project-member] Acquired lock "[datastore2] 
devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e/b1400c31-d33b-4e13-944f-4c645e62493e.vmdk" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1658.361400] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-bc278641-e9dc-495f-b85d-d30d2e1da09f tempest-ServerShowV254Test-1577433897 tempest-ServerShowV254Test-1577433897-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=68906) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1658.361634] env[68906]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-6e9c337f-ef31-4e17-8b10-dff29e3281ba {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1658.364038] env[68906]: DEBUG nova.compute.manager [None req-ec74f4e8-e37f-4e29-9aee-a15837533de1 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: 1fdb401a-ac25-4418-803c-fc0b2297f2d4] Start destroying the instance on the hypervisor. {{(pid=68906) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1658.364235] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-ec74f4e8-e37f-4e29-9aee-a15837533de1 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: 1fdb401a-ac25-4418-803c-fc0b2297f2d4] Destroying instance {{(pid=68906) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1658.364956] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9a5ef2eb-bf5a-4ec3-9a62-ca4ca9a28e0e {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1658.371694] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-ec74f4e8-e37f-4e29-9aee-a15837533de1 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: 1fdb401a-ac25-4418-803c-fc0b2297f2d4] Unregistering the VM {{(pid=68906) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1658.372693] env[68906]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-f86a308c-f63a-4b44-b2ee-726aafe7c495 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1658.374028] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-bc278641-e9dc-495f-b85d-d30d2e1da09f tempest-ServerShowV254Test-1577433897 tempest-ServerShowV254Test-1577433897-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=68906) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1658.374206] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-bc278641-e9dc-495f-b85d-d30d2e1da09f tempest-ServerShowV254Test-1577433897 tempest-ServerShowV254Test-1577433897-project-member] Folder [datastore2] devstack-image-cache_base created. 
{{(pid=68906) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1658.374849] env[68906]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-7b3c3f33-d736-43dd-8d44-512b9742bb7b {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1658.379808] env[68906]: DEBUG oslo_vmware.api [None req-bc278641-e9dc-495f-b85d-d30d2e1da09f tempest-ServerShowV254Test-1577433897 tempest-ServerShowV254Test-1577433897-project-member] Waiting for the task: (returnval){ [ 1658.379808] env[68906]: value = "session[52a3cb0d-1212-490c-2669-91043b4da4d8]524fbbdc-031a-1c31-8abb-84342e49e86c" [ 1658.379808] env[68906]: _type = "Task" [ 1658.379808] env[68906]: } to complete. {{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1658.389677] env[68906]: DEBUG oslo_vmware.api [None req-bc278641-e9dc-495f-b85d-d30d2e1da09f tempest-ServerShowV254Test-1577433897 tempest-ServerShowV254Test-1577433897-project-member] Task: {'id': session[52a3cb0d-1212-490c-2669-91043b4da4d8]524fbbdc-031a-1c31-8abb-84342e49e86c, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1658.446120] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-ec74f4e8-e37f-4e29-9aee-a15837533de1 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: 1fdb401a-ac25-4418-803c-fc0b2297f2d4] Unregistered the VM {{(pid=68906) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1658.446429] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-ec74f4e8-e37f-4e29-9aee-a15837533de1 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: 1fdb401a-ac25-4418-803c-fc0b2297f2d4] Deleting contents of the VM from datastore datastore2 {{(pid=68906) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1658.446631] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-ec74f4e8-e37f-4e29-9aee-a15837533de1 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Deleting the datastore file [datastore2] 1fdb401a-ac25-4418-803c-fc0b2297f2d4 {{(pid=68906) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1658.446936] env[68906]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-13a35e5b-8b07-45c7-b8ab-3670e84fddbc {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1658.453379] env[68906]: DEBUG oslo_vmware.api [None req-ec74f4e8-e37f-4e29-9aee-a15837533de1 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Waiting for the task: (returnval){ [ 1658.453379] env[68906]: value = "task-3475418" [ 1658.453379] env[68906]: _type = "Task" [ 1658.453379] env[68906]: } to complete. {{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1658.460840] env[68906]: DEBUG oslo_vmware.api [None req-ec74f4e8-e37f-4e29-9aee-a15837533de1 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Task: {'id': task-3475418, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1658.889864] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-bc278641-e9dc-495f-b85d-d30d2e1da09f tempest-ServerShowV254Test-1577433897 tempest-ServerShowV254Test-1577433897-project-member] [instance: e0e595e3-e47e-4cf1-8977-f004eca942d1] Preparing fetch location {{(pid=68906) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1658.890286] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-bc278641-e9dc-495f-b85d-d30d2e1da09f tempest-ServerShowV254Test-1577433897 tempest-ServerShowV254Test-1577433897-project-member] Creating directory with path [datastore2] vmware_temp/f20a4bf2-c69b-43da-9237-8efa6ad124e1/b1400c31-d33b-4e13-944f-4c645e62493e {{(pid=68906) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1658.890378] env[68906]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-c16c5f0c-04d1-4d72-9d24-377aaee185f6 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1658.901819] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-bc278641-e9dc-495f-b85d-d30d2e1da09f tempest-ServerShowV254Test-1577433897 tempest-ServerShowV254Test-1577433897-project-member] Created directory with path [datastore2] vmware_temp/f20a4bf2-c69b-43da-9237-8efa6ad124e1/b1400c31-d33b-4e13-944f-4c645e62493e {{(pid=68906) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1658.902012] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-bc278641-e9dc-495f-b85d-d30d2e1da09f tempest-ServerShowV254Test-1577433897 tempest-ServerShowV254Test-1577433897-project-member] [instance: e0e595e3-e47e-4cf1-8977-f004eca942d1] Fetch image to [datastore2] vmware_temp/f20a4bf2-c69b-43da-9237-8efa6ad124e1/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk {{(pid=68906) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1658.902191] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-bc278641-e9dc-495f-b85d-d30d2e1da09f tempest-ServerShowV254Test-1577433897 tempest-ServerShowV254Test-1577433897-project-member] [instance: e0e595e3-e47e-4cf1-8977-f004eca942d1] Downloading image file data b1400c31-d33b-4e13-944f-4c645e62493e to [datastore2] vmware_temp/f20a4bf2-c69b-43da-9237-8efa6ad124e1/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk on the data store datastore2 {{(pid=68906) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1658.902926] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-33ec2218-fe64-4c2a-83a6-5f5499f873f3 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1658.909654] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-94a877e4-46fc-4440-909e-836adf86139c {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1658.919692] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a3735cfe-d46e-45ee-9892-2d4b1e706527 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1658.951118] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-06f8445a-3d88-4ee1-9446-d82996c0a43b {{(pid=68906) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1658.958725] env[68906]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-b7a9cf96-c236-4705-a16e-564f1cb6cc25 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1658.962958] env[68906]: DEBUG oslo_vmware.api [None req-ec74f4e8-e37f-4e29-9aee-a15837533de1 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Task: {'id': task-3475418, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.062921} completed successfully. {{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1658.963520] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-ec74f4e8-e37f-4e29-9aee-a15837533de1 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Deleted the datastore file {{(pid=68906) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1658.963731] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-ec74f4e8-e37f-4e29-9aee-a15837533de1 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: 1fdb401a-ac25-4418-803c-fc0b2297f2d4] Deleted contents of the VM from datastore datastore2 {{(pid=68906) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1658.963928] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-ec74f4e8-e37f-4e29-9aee-a15837533de1 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: 1fdb401a-ac25-4418-803c-fc0b2297f2d4] Instance destroyed {{(pid=68906) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1658.964140] env[68906]: INFO nova.compute.manager [None req-ec74f4e8-e37f-4e29-9aee-a15837533de1 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: 1fdb401a-ac25-4418-803c-fc0b2297f2d4] Took 0.60 seconds to destroy the instance on the hypervisor. 
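[editor's sketch] The repeated 'Waiting for the task: (returnval){ value = "task-..." }' and "progress is 0%" entries come from oslo.vmware polling a vCenter task until it reaches a terminal state. The loop below is a simplified sketch of that poll-until-done behavior, assuming a session object with a hypothetical get_task_info accessor; the real oslo.vmware wait_for_task drives the poll with a looping call and translates vCenter faults into exceptions, as the InvalidArgument traceback earlier in this log shows.

    import time

    class TaskFailed(Exception):
        # Stand-in for oslo_vmware.exceptions.VimFaultException and friends.
        pass

    def wait_for_task(session, task_ref, interval=0.5):
        # Poll the task until it succeeds or fails; get_task_info() is a
        # hypothetical accessor, not the oslo.vmware API.
        while True:
            info = session.get_task_info(task_ref)
            if info.state == "success":
                return info.result
            if info.state == "error":
                raise TaskFailed(info.error)
            # 'queued'/'running': keep waiting, as the recurring
            # "progress is 0%" debug lines above illustrate.
            time.sleep(interval)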
[ 1658.966299] env[68906]: DEBUG nova.compute.claims [None req-ec74f4e8-e37f-4e29-9aee-a15837533de1 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: 1fdb401a-ac25-4418-803c-fc0b2297f2d4] Aborting claim: {{(pid=68906) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1658.966478] env[68906]: DEBUG oslo_concurrency.lockutils [None req-ec74f4e8-e37f-4e29-9aee-a15837533de1 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1658.966689] env[68906]: DEBUG oslo_concurrency.lockutils [None req-ec74f4e8-e37f-4e29-9aee-a15837533de1 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1658.995461] env[68906]: DEBUG nova.virt.vmwareapi.images [None req-bc278641-e9dc-495f-b85d-d30d2e1da09f tempest-ServerShowV254Test-1577433897 tempest-ServerShowV254Test-1577433897-project-member] [instance: e0e595e3-e47e-4cf1-8977-f004eca942d1] Downloading image file data b1400c31-d33b-4e13-944f-4c645e62493e to the data store datastore2 {{(pid=68906) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1659.052959] env[68906]: DEBUG oslo_vmware.rw_handles [None req-bc278641-e9dc-495f-b85d-d30d2e1da09f tempest-ServerShowV254Test-1577433897 tempest-ServerShowV254Test-1577433897-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/f20a4bf2-c69b-43da-9237-8efa6ad124e1/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=68906) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1659.113287] env[68906]: DEBUG oslo_vmware.rw_handles [None req-bc278641-e9dc-495f-b85d-d30d2e1da09f tempest-ServerShowV254Test-1577433897 tempest-ServerShowV254Test-1577433897-project-member] Completed reading data from the image iterator. {{(pid=68906) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1659.113490] env[68906]: DEBUG oslo_vmware.rw_handles [None req-bc278641-e9dc-495f-b85d-d30d2e1da09f tempest-ServerShowV254Test-1577433897 tempest-ServerShowV254Test-1577433897-project-member] Closing write handle for https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/f20a4bf2-c69b-43da-9237-8efa6ad124e1/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=68906) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1659.254341] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ef5da4f8-269d-4a79-8efa-7b355c2c5d16 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1659.263026] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b97956da-ec3f-4168-be7c-8d273ee01b02 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1659.292571] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2f1cdf47-b687-42d3-910d-a08cc5a1b615 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1659.299409] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-37070cc9-1264-41da-b4c8-e38034060493 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1659.312791] env[68906]: DEBUG nova.compute.provider_tree [None req-ec74f4e8-e37f-4e29-9aee-a15837533de1 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Inventory has not changed in ProviderTree for provider: 1119f6db-bfd7-4ef3-bdff-5c6974dc249b {{(pid=68906) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1659.322481] env[68906]: DEBUG nova.scheduler.client.report [None req-ec74f4e8-e37f-4e29-9aee-a15837533de1 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Inventory has not changed for provider 1119f6db-bfd7-4ef3-bdff-5c6974dc249b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 93, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68906) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1659.335857] env[68906]: DEBUG oslo_concurrency.lockutils [None req-ec74f4e8-e37f-4e29-9aee-a15837533de1 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.369s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1659.336430] env[68906]: ERROR nova.compute.manager [None req-ec74f4e8-e37f-4e29-9aee-a15837533de1 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: 1fdb401a-ac25-4418-803c-fc0b2297f2d4] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1659.336430] env[68906]: Faults: ['InvalidArgument'] [ 1659.336430] env[68906]: ERROR nova.compute.manager [instance: 1fdb401a-ac25-4418-803c-fc0b2297f2d4] Traceback (most recent call last): [ 1659.336430] env[68906]: ERROR nova.compute.manager [instance: 1fdb401a-ac25-4418-803c-fc0b2297f2d4] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1659.336430] env[68906]: ERROR nova.compute.manager [instance: 
1fdb401a-ac25-4418-803c-fc0b2297f2d4] self.driver.spawn(context, instance, image_meta, [ 1659.336430] env[68906]: ERROR nova.compute.manager [instance: 1fdb401a-ac25-4418-803c-fc0b2297f2d4] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1659.336430] env[68906]: ERROR nova.compute.manager [instance: 1fdb401a-ac25-4418-803c-fc0b2297f2d4] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1659.336430] env[68906]: ERROR nova.compute.manager [instance: 1fdb401a-ac25-4418-803c-fc0b2297f2d4] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1659.336430] env[68906]: ERROR nova.compute.manager [instance: 1fdb401a-ac25-4418-803c-fc0b2297f2d4] self._fetch_image_if_missing(context, vi) [ 1659.336430] env[68906]: ERROR nova.compute.manager [instance: 1fdb401a-ac25-4418-803c-fc0b2297f2d4] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1659.336430] env[68906]: ERROR nova.compute.manager [instance: 1fdb401a-ac25-4418-803c-fc0b2297f2d4] image_cache(vi, tmp_image_ds_loc) [ 1659.336430] env[68906]: ERROR nova.compute.manager [instance: 1fdb401a-ac25-4418-803c-fc0b2297f2d4] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1659.336845] env[68906]: ERROR nova.compute.manager [instance: 1fdb401a-ac25-4418-803c-fc0b2297f2d4] vm_util.copy_virtual_disk( [ 1659.336845] env[68906]: ERROR nova.compute.manager [instance: 1fdb401a-ac25-4418-803c-fc0b2297f2d4] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1659.336845] env[68906]: ERROR nova.compute.manager [instance: 1fdb401a-ac25-4418-803c-fc0b2297f2d4] session._wait_for_task(vmdk_copy_task) [ 1659.336845] env[68906]: ERROR nova.compute.manager [instance: 1fdb401a-ac25-4418-803c-fc0b2297f2d4] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1659.336845] env[68906]: ERROR nova.compute.manager [instance: 1fdb401a-ac25-4418-803c-fc0b2297f2d4] return self.wait_for_task(task_ref) [ 1659.336845] env[68906]: ERROR nova.compute.manager [instance: 1fdb401a-ac25-4418-803c-fc0b2297f2d4] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1659.336845] env[68906]: ERROR nova.compute.manager [instance: 1fdb401a-ac25-4418-803c-fc0b2297f2d4] return evt.wait() [ 1659.336845] env[68906]: ERROR nova.compute.manager [instance: 1fdb401a-ac25-4418-803c-fc0b2297f2d4] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1659.336845] env[68906]: ERROR nova.compute.manager [instance: 1fdb401a-ac25-4418-803c-fc0b2297f2d4] result = hub.switch() [ 1659.336845] env[68906]: ERROR nova.compute.manager [instance: 1fdb401a-ac25-4418-803c-fc0b2297f2d4] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1659.336845] env[68906]: ERROR nova.compute.manager [instance: 1fdb401a-ac25-4418-803c-fc0b2297f2d4] return self.greenlet.switch() [ 1659.336845] env[68906]: ERROR nova.compute.manager [instance: 1fdb401a-ac25-4418-803c-fc0b2297f2d4] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1659.336845] env[68906]: ERROR nova.compute.manager [instance: 1fdb401a-ac25-4418-803c-fc0b2297f2d4] self.f(*self.args, **self.kw) [ 1659.337246] env[68906]: ERROR nova.compute.manager [instance: 1fdb401a-ac25-4418-803c-fc0b2297f2d4] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1659.337246] env[68906]: ERROR nova.compute.manager [instance: 1fdb401a-ac25-4418-803c-fc0b2297f2d4] raise exceptions.translate_fault(task_info.error) [ 1659.337246] env[68906]: ERROR nova.compute.manager [instance: 1fdb401a-ac25-4418-803c-fc0b2297f2d4] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1659.337246] env[68906]: ERROR nova.compute.manager [instance: 1fdb401a-ac25-4418-803c-fc0b2297f2d4] Faults: ['InvalidArgument'] [ 1659.337246] env[68906]: ERROR nova.compute.manager [instance: 1fdb401a-ac25-4418-803c-fc0b2297f2d4] [ 1659.337246] env[68906]: DEBUG nova.compute.utils [None req-ec74f4e8-e37f-4e29-9aee-a15837533de1 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: 1fdb401a-ac25-4418-803c-fc0b2297f2d4] VimFaultException {{(pid=68906) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1659.338726] env[68906]: DEBUG nova.compute.manager [None req-ec74f4e8-e37f-4e29-9aee-a15837533de1 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: 1fdb401a-ac25-4418-803c-fc0b2297f2d4] Build of instance 1fdb401a-ac25-4418-803c-fc0b2297f2d4 was re-scheduled: A specified parameter was not correct: fileType [ 1659.338726] env[68906]: Faults: ['InvalidArgument'] {{(pid=68906) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1659.339106] env[68906]: DEBUG nova.compute.manager [None req-ec74f4e8-e37f-4e29-9aee-a15837533de1 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: 1fdb401a-ac25-4418-803c-fc0b2297f2d4] Unplugging VIFs for instance {{(pid=68906) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1659.339281] env[68906]: DEBUG nova.compute.manager [None req-ec74f4e8-e37f-4e29-9aee-a15837533de1 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=68906) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1659.339448] env[68906]: DEBUG nova.compute.manager [None req-ec74f4e8-e37f-4e29-9aee-a15837533de1 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: 1fdb401a-ac25-4418-803c-fc0b2297f2d4] Deallocating network for instance {{(pid=68906) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1659.339611] env[68906]: DEBUG nova.network.neutron [None req-ec74f4e8-e37f-4e29-9aee-a15837533de1 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: 1fdb401a-ac25-4418-803c-fc0b2297f2d4] deallocate_for_instance() {{(pid=68906) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1659.636455] env[68906]: DEBUG nova.network.neutron [None req-ec74f4e8-e37f-4e29-9aee-a15837533de1 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: 1fdb401a-ac25-4418-803c-fc0b2297f2d4] Updating instance_info_cache with network_info: [] {{(pid=68906) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1659.649240] env[68906]: INFO nova.compute.manager [None req-ec74f4e8-e37f-4e29-9aee-a15837533de1 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: 1fdb401a-ac25-4418-803c-fc0b2297f2d4] Took 0.31 seconds to deallocate network for instance. [ 1659.745327] env[68906]: INFO nova.scheduler.client.report [None req-ec74f4e8-e37f-4e29-9aee-a15837533de1 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Deleted allocations for instance 1fdb401a-ac25-4418-803c-fc0b2297f2d4 [ 1659.769297] env[68906]: DEBUG oslo_concurrency.lockutils [None req-ec74f4e8-e37f-4e29-9aee-a15837533de1 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Lock "1fdb401a-ac25-4418-803c-fc0b2297f2d4" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 631.293s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1659.770476] env[68906]: DEBUG oslo_concurrency.lockutils [None req-b8824427-275b-4d8c-bc13-f6c3124d4cfd tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Lock "1fdb401a-ac25-4418-803c-fc0b2297f2d4" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 434.657s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1659.770749] env[68906]: DEBUG oslo_concurrency.lockutils [None req-b8824427-275b-4d8c-bc13-f6c3124d4cfd tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Acquiring lock "1fdb401a-ac25-4418-803c-fc0b2297f2d4-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1659.770967] env[68906]: DEBUG oslo_concurrency.lockutils [None req-b8824427-275b-4d8c-bc13-f6c3124d4cfd tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Lock "1fdb401a-ac25-4418-803c-fc0b2297f2d4-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 
1659.771152] env[68906]: DEBUG oslo_concurrency.lockutils [None req-b8824427-275b-4d8c-bc13-f6c3124d4cfd tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Lock "1fdb401a-ac25-4418-803c-fc0b2297f2d4-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1659.773117] env[68906]: INFO nova.compute.manager [None req-b8824427-275b-4d8c-bc13-f6c3124d4cfd tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: 1fdb401a-ac25-4418-803c-fc0b2297f2d4] Terminating instance [ 1659.774755] env[68906]: DEBUG nova.compute.manager [None req-b8824427-275b-4d8c-bc13-f6c3124d4cfd tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: 1fdb401a-ac25-4418-803c-fc0b2297f2d4] Start destroying the instance on the hypervisor. {{(pid=68906) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1659.774945] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-b8824427-275b-4d8c-bc13-f6c3124d4cfd tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: 1fdb401a-ac25-4418-803c-fc0b2297f2d4] Destroying instance {{(pid=68906) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1659.775456] env[68906]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-53398821-e56e-4a90-808f-90ecd47734f6 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1659.784548] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-88e22f5f-07a8-4606-b86c-3f16800e2c52 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1659.795253] env[68906]: DEBUG nova.compute.manager [None req-fbc84413-841c-44c5-a0d9-1ff0baa7121b tempest-ServerDiskConfigTestJSON-1909384467 tempest-ServerDiskConfigTestJSON-1909384467-project-member] [instance: 302e2275-a3ec-48c5-899e-6f385190bfe8] Starting instance... {{(pid=68906) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1659.815556] env[68906]: WARNING nova.virt.vmwareapi.vmops [None req-b8824427-275b-4d8c-bc13-f6c3124d4cfd tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: 1fdb401a-ac25-4418-803c-fc0b2297f2d4] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 1fdb401a-ac25-4418-803c-fc0b2297f2d4 could not be found. [ 1659.815750] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-b8824427-275b-4d8c-bc13-f6c3124d4cfd tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: 1fdb401a-ac25-4418-803c-fc0b2297f2d4] Instance destroyed {{(pid=68906) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1659.815923] env[68906]: INFO nova.compute.manager [None req-b8824427-275b-4d8c-bc13-f6c3124d4cfd tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: 1fdb401a-ac25-4418-803c-fc0b2297f2d4] Took 0.04 seconds to destroy the instance on the hypervisor. 
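[editor's sketch] The terminate path above tolerates a VM that has already vanished from the backend: destroy raises InstanceNotFound, Nova logs a warning ("Instance does not exist on backend"), and cleanup such as network deallocation still proceeds. A sketch of that tolerant-destroy shape, with illustrative names rather than Nova's exact call chain:

    class InstanceNotFound(Exception):
        # Stand-in for nova.exception.InstanceNotFound.
        pass

    def deallocate_network(instance_uuid):
        # Hypothetical cleanup hook; runs regardless of backend state.
        print("deallocating network for %s" % instance_uuid)

    def terminate_instance(driver, instance_uuid):
        # If the hypervisor no longer knows the VM, warn and continue so
        # network and resource cleanup still happen, mirroring the
        # "Instance does not exist on backend" warning above.
        try:
            driver.destroy(instance_uuid)
        except InstanceNotFound:
            print("WARNING: instance %s does not exist on backend"
                  % instance_uuid)
        deallocate_network(instance_uuid)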
[ 1659.816202] env[68906]: DEBUG oslo.service.loopingcall [None req-b8824427-275b-4d8c-bc13-f6c3124d4cfd tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=68906) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1659.816427] env[68906]: DEBUG nova.compute.manager [-] [instance: 1fdb401a-ac25-4418-803c-fc0b2297f2d4] Deallocating network for instance {{(pid=68906) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1659.816525] env[68906]: DEBUG nova.network.neutron [-] [instance: 1fdb401a-ac25-4418-803c-fc0b2297f2d4] deallocate_for_instance() {{(pid=68906) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1659.820771] env[68906]: DEBUG nova.compute.manager [None req-fbc84413-841c-44c5-a0d9-1ff0baa7121b tempest-ServerDiskConfigTestJSON-1909384467 tempest-ServerDiskConfigTestJSON-1909384467-project-member] [instance: 302e2275-a3ec-48c5-899e-6f385190bfe8] Instance disappeared before build. {{(pid=68906) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1659.838770] env[68906]: DEBUG nova.network.neutron [-] [instance: 1fdb401a-ac25-4418-803c-fc0b2297f2d4] Updating instance_info_cache with network_info: [] {{(pid=68906) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1659.840717] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fbc84413-841c-44c5-a0d9-1ff0baa7121b tempest-ServerDiskConfigTestJSON-1909384467 tempest-ServerDiskConfigTestJSON-1909384467-project-member] Lock "302e2275-a3ec-48c5-899e-6f385190bfe8" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 204.531s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1659.846878] env[68906]: INFO nova.compute.manager [-] [instance: 1fdb401a-ac25-4418-803c-fc0b2297f2d4] Took 0.03 seconds to deallocate network for instance. [ 1659.851489] env[68906]: DEBUG nova.compute.manager [None req-bf37900d-8159-4835-aaeb-3f0e048a23be tempest-SecurityGroupsTestJSON-973572118 tempest-SecurityGroupsTestJSON-973572118-project-member] [instance: a59ab448-c4f1-4f54-be7a-7e204130f3f8] Starting instance... {{(pid=68906) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1659.873926] env[68906]: DEBUG nova.compute.manager [None req-bf37900d-8159-4835-aaeb-3f0e048a23be tempest-SecurityGroupsTestJSON-973572118 tempest-SecurityGroupsTestJSON-973572118-project-member] [instance: a59ab448-c4f1-4f54-be7a-7e204130f3f8] Instance disappeared before build. 
{{(pid=68906) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1659.893190] env[68906]: DEBUG oslo_concurrency.lockutils [None req-bf37900d-8159-4835-aaeb-3f0e048a23be tempest-SecurityGroupsTestJSON-973572118 tempest-SecurityGroupsTestJSON-973572118-project-member] Lock "a59ab448-c4f1-4f54-be7a-7e204130f3f8" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 202.739s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1659.902852] env[68906]: DEBUG nova.compute.manager [None req-5a9e0d39-3b4d-4e01-a665-7603e8d12e3f tempest-ServersNegativeTestMultiTenantJSON-1592777577 tempest-ServersNegativeTestMultiTenantJSON-1592777577-project-member] [instance: 736db39c-e5e5-4a54-b85a-aa5c703f432e] Starting instance... {{(pid=68906) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1659.952284] env[68906]: DEBUG oslo_concurrency.lockutils [None req-b8824427-275b-4d8c-bc13-f6c3124d4cfd tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Lock "1fdb401a-ac25-4418-803c-fc0b2297f2d4" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.182s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1659.959544] env[68906]: DEBUG oslo_concurrency.lockutils [None req-5a9e0d39-3b4d-4e01-a665-7603e8d12e3f tempest-ServersNegativeTestMultiTenantJSON-1592777577 tempest-ServersNegativeTestMultiTenantJSON-1592777577-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1659.959788] env[68906]: DEBUG oslo_concurrency.lockutils [None req-5a9e0d39-3b4d-4e01-a665-7603e8d12e3f tempest-ServersNegativeTestMultiTenantJSON-1592777577 tempest-ServersNegativeTestMultiTenantJSON-1592777577-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1659.961298] env[68906]: INFO nova.compute.claims [None req-5a9e0d39-3b4d-4e01-a665-7603e8d12e3f tempest-ServersNegativeTestMultiTenantJSON-1592777577 tempest-ServersNegativeTestMultiTenantJSON-1592777577-project-member] [instance: 736db39c-e5e5-4a54-b85a-aa5c703f432e] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1660.171012] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a6331881-4d21-479b-8fab-bd53f3d59cb3 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1660.179861] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-df5c2759-54ae-4711-b568-14516aa81841 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1660.210293] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0274bc12-8b20-47da-afd6-261182261f49 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1660.217438] env[68906]: DEBUG oslo_vmware.service [-] Invoking 
PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5f176b02-4ece-4c0d-a6f6-c576b7cbc66a {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1660.230317] env[68906]: DEBUG nova.compute.provider_tree [None req-5a9e0d39-3b4d-4e01-a665-7603e8d12e3f tempest-ServersNegativeTestMultiTenantJSON-1592777577 tempest-ServersNegativeTestMultiTenantJSON-1592777577-project-member] Inventory has not changed in ProviderTree for provider: 1119f6db-bfd7-4ef3-bdff-5c6974dc249b {{(pid=68906) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1660.239165] env[68906]: DEBUG nova.scheduler.client.report [None req-5a9e0d39-3b4d-4e01-a665-7603e8d12e3f tempest-ServersNegativeTestMultiTenantJSON-1592777577 tempest-ServersNegativeTestMultiTenantJSON-1592777577-project-member] Inventory has not changed for provider 1119f6db-bfd7-4ef3-bdff-5c6974dc249b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 93, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68906) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1660.253131] env[68906]: DEBUG oslo_concurrency.lockutils [None req-5a9e0d39-3b4d-4e01-a665-7603e8d12e3f tempest-ServersNegativeTestMultiTenantJSON-1592777577 tempest-ServersNegativeTestMultiTenantJSON-1592777577-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.293s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1660.253612] env[68906]: DEBUG nova.compute.manager [None req-5a9e0d39-3b4d-4e01-a665-7603e8d12e3f tempest-ServersNegativeTestMultiTenantJSON-1592777577 tempest-ServersNegativeTestMultiTenantJSON-1592777577-project-member] [instance: 736db39c-e5e5-4a54-b85a-aa5c703f432e] Start building networks asynchronously for instance. {{(pid=68906) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1660.289030] env[68906]: DEBUG nova.compute.utils [None req-5a9e0d39-3b4d-4e01-a665-7603e8d12e3f tempest-ServersNegativeTestMultiTenantJSON-1592777577 tempest-ServersNegativeTestMultiTenantJSON-1592777577-project-member] Using /dev/sd instead of None {{(pid=68906) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1660.290622] env[68906]: DEBUG nova.compute.manager [None req-5a9e0d39-3b4d-4e01-a665-7603e8d12e3f tempest-ServersNegativeTestMultiTenantJSON-1592777577 tempest-ServersNegativeTestMultiTenantJSON-1592777577-project-member] [instance: 736db39c-e5e5-4a54-b85a-aa5c703f432e] Allocating IP information in the background. 
{{(pid=68906) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1660.290799] env[68906]: DEBUG nova.network.neutron [None req-5a9e0d39-3b4d-4e01-a665-7603e8d12e3f tempest-ServersNegativeTestMultiTenantJSON-1592777577 tempest-ServersNegativeTestMultiTenantJSON-1592777577-project-member] [instance: 736db39c-e5e5-4a54-b85a-aa5c703f432e] allocate_for_instance() {{(pid=68906) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1660.301837] env[68906]: DEBUG nova.compute.manager [None req-5a9e0d39-3b4d-4e01-a665-7603e8d12e3f tempest-ServersNegativeTestMultiTenantJSON-1592777577 tempest-ServersNegativeTestMultiTenantJSON-1592777577-project-member] [instance: 736db39c-e5e5-4a54-b85a-aa5c703f432e] Start building block device mappings for instance. {{(pid=68906) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1660.357177] env[68906]: DEBUG nova.policy [None req-5a9e0d39-3b4d-4e01-a665-7603e8d12e3f tempest-ServersNegativeTestMultiTenantJSON-1592777577 tempest-ServersNegativeTestMultiTenantJSON-1592777577-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'dd0062092c7f46f4a7c7f2731ce2eaed', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '0752074183dc4976bd1966f655ce528b', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68906) authorize /opt/stack/nova/nova/policy.py:203}} [ 1660.365887] env[68906]: DEBUG nova.compute.manager [None req-5a9e0d39-3b4d-4e01-a665-7603e8d12e3f tempest-ServersNegativeTestMultiTenantJSON-1592777577 tempest-ServersNegativeTestMultiTenantJSON-1592777577-project-member] [instance: 736db39c-e5e5-4a54-b85a-aa5c703f432e] Start spawning the instance on the hypervisor. 
{{(pid=68906) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1660.395020] env[68906]: DEBUG nova.virt.hardware [None req-5a9e0d39-3b4d-4e01-a665-7603e8d12e3f tempest-ServersNegativeTestMultiTenantJSON-1592777577 tempest-ServersNegativeTestMultiTenantJSON-1592777577-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-17T13:00:39Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-17T13:00:23Z,direct_url=<?>,disk_format='vmdk',id=b1400c31-d33b-4e13-944f-4c645e62493e,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='1ae7bf3a375d41c6af5e7536af51ffd1',properties=ImageMetaProps,protected=<?>,size=21318656,status='active',tags=<?>,updated_at=2025-04-17T13:00:24Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=68906) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1660.395020] env[68906]: DEBUG nova.virt.hardware [None req-5a9e0d39-3b4d-4e01-a665-7603e8d12e3f tempest-ServersNegativeTestMultiTenantJSON-1592777577 tempest-ServersNegativeTestMultiTenantJSON-1592777577-project-member] Flavor limits 0:0:0 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1660.395020] env[68906]: DEBUG nova.virt.hardware [None req-5a9e0d39-3b4d-4e01-a665-7603e8d12e3f tempest-ServersNegativeTestMultiTenantJSON-1592777577 tempest-ServersNegativeTestMultiTenantJSON-1592777577-project-member] Image limits 0:0:0 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1660.395233] env[68906]: DEBUG nova.virt.hardware [None req-5a9e0d39-3b4d-4e01-a665-7603e8d12e3f tempest-ServersNegativeTestMultiTenantJSON-1592777577 tempest-ServersNegativeTestMultiTenantJSON-1592777577-project-member] Flavor pref 0:0:0 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1660.395233] env[68906]: DEBUG nova.virt.hardware [None req-5a9e0d39-3b4d-4e01-a665-7603e8d12e3f tempest-ServersNegativeTestMultiTenantJSON-1592777577 tempest-ServersNegativeTestMultiTenantJSON-1592777577-project-member] Image pref 0:0:0 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1660.395233] env[68906]: DEBUG nova.virt.hardware [None req-5a9e0d39-3b4d-4e01-a665-7603e8d12e3f tempest-ServersNegativeTestMultiTenantJSON-1592777577 tempest-ServersNegativeTestMultiTenantJSON-1592777577-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1660.395233] env[68906]: DEBUG nova.virt.hardware [None req-5a9e0d39-3b4d-4e01-a665-7603e8d12e3f tempest-ServersNegativeTestMultiTenantJSON-1592777577 tempest-ServersNegativeTestMultiTenantJSON-1592777577-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68906) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1660.395686] env[68906]: DEBUG nova.virt.hardware [None req-5a9e0d39-3b4d-4e01-a665-7603e8d12e3f tempest-ServersNegativeTestMultiTenantJSON-1592777577 
tempest-ServersNegativeTestMultiTenantJSON-1592777577-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68906) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1660.396272] env[68906]: DEBUG nova.virt.hardware [None req-5a9e0d39-3b4d-4e01-a665-7603e8d12e3f tempest-ServersNegativeTestMultiTenantJSON-1592777577 tempest-ServersNegativeTestMultiTenantJSON-1592777577-project-member] Got 1 possible topologies {{(pid=68906) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1660.396781] env[68906]: DEBUG nova.virt.hardware [None req-5a9e0d39-3b4d-4e01-a665-7603e8d12e3f tempest-ServersNegativeTestMultiTenantJSON-1592777577 tempest-ServersNegativeTestMultiTenantJSON-1592777577-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68906) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1660.397350] env[68906]: DEBUG nova.virt.hardware [None req-5a9e0d39-3b4d-4e01-a665-7603e8d12e3f tempest-ServersNegativeTestMultiTenantJSON-1592777577 tempest-ServersNegativeTestMultiTenantJSON-1592777577-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68906) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1660.398967] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dafad054-ede5-45fe-a75f-e0f57ac32d1c {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1660.413923] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e16d0569-1506-4e85-aaf6-516dca5307b2 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1660.658554] env[68906]: DEBUG nova.network.neutron [None req-5a9e0d39-3b4d-4e01-a665-7603e8d12e3f tempest-ServersNegativeTestMultiTenantJSON-1592777577 tempest-ServersNegativeTestMultiTenantJSON-1592777577-project-member] [instance: 736db39c-e5e5-4a54-b85a-aa5c703f432e] Successfully created port: ec817996-a603-4e5e-a56d-7dcd99f8fb85 {{(pid=68906) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1661.195963] env[68906]: DEBUG nova.network.neutron [None req-5a9e0d39-3b4d-4e01-a665-7603e8d12e3f tempest-ServersNegativeTestMultiTenantJSON-1592777577 tempest-ServersNegativeTestMultiTenantJSON-1592777577-project-member] [instance: 736db39c-e5e5-4a54-b85a-aa5c703f432e] Successfully updated port: ec817996-a603-4e5e-a56d-7dcd99f8fb85 {{(pid=68906) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1661.208185] env[68906]: DEBUG oslo_concurrency.lockutils [None req-5a9e0d39-3b4d-4e01-a665-7603e8d12e3f tempest-ServersNegativeTestMultiTenantJSON-1592777577 tempest-ServersNegativeTestMultiTenantJSON-1592777577-project-member] Acquiring lock "refresh_cache-736db39c-e5e5-4a54-b85a-aa5c703f432e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1661.208351] env[68906]: DEBUG oslo_concurrency.lockutils [None req-5a9e0d39-3b4d-4e01-a665-7603e8d12e3f tempest-ServersNegativeTestMultiTenantJSON-1592777577 tempest-ServersNegativeTestMultiTenantJSON-1592777577-project-member] Acquired lock "refresh_cache-736db39c-e5e5-4a54-b85a-aa5c703f432e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1661.208566] env[68906]: DEBUG nova.network.neutron [None 
req-5a9e0d39-3b4d-4e01-a665-7603e8d12e3f tempest-ServersNegativeTestMultiTenantJSON-1592777577 tempest-ServersNegativeTestMultiTenantJSON-1592777577-project-member] [instance: 736db39c-e5e5-4a54-b85a-aa5c703f432e] Building network info cache for instance {{(pid=68906) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1661.249019] env[68906]: DEBUG nova.network.neutron [None req-5a9e0d39-3b4d-4e01-a665-7603e8d12e3f tempest-ServersNegativeTestMultiTenantJSON-1592777577 tempest-ServersNegativeTestMultiTenantJSON-1592777577-project-member] [instance: 736db39c-e5e5-4a54-b85a-aa5c703f432e] Instance cache missing network info. {{(pid=68906) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1661.405714] env[68906]: DEBUG nova.network.neutron [None req-5a9e0d39-3b4d-4e01-a665-7603e8d12e3f tempest-ServersNegativeTestMultiTenantJSON-1592777577 tempest-ServersNegativeTestMultiTenantJSON-1592777577-project-member] [instance: 736db39c-e5e5-4a54-b85a-aa5c703f432e] Updating instance_info_cache with network_info: [{"id": "ec817996-a603-4e5e-a56d-7dcd99f8fb85", "address": "fa:16:3e:ea:40:cd", "network": {"id": "b61dca0d-78cb-4639-8786-9c0aa2890aa0", "bridge": "br-int", "label": "tempest-ServersNegativeTestMultiTenantJSON-1831984153-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "0752074183dc4976bd1966f655ce528b", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "456bd8a2-0fb6-4b17-9d25-08e7995c5184", "external-id": "nsx-vlan-transportzone-65", "segmentation_id": 65, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapec817996-a6", "ovs_interfaceid": "ec817996-a603-4e5e-a56d-7dcd99f8fb85", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68906) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1661.418871] env[68906]: DEBUG oslo_concurrency.lockutils [None req-5a9e0d39-3b4d-4e01-a665-7603e8d12e3f tempest-ServersNegativeTestMultiTenantJSON-1592777577 tempest-ServersNegativeTestMultiTenantJSON-1592777577-project-member] Releasing lock "refresh_cache-736db39c-e5e5-4a54-b85a-aa5c703f432e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1661.419269] env[68906]: DEBUG nova.compute.manager [None req-5a9e0d39-3b4d-4e01-a665-7603e8d12e3f tempest-ServersNegativeTestMultiTenantJSON-1592777577 tempest-ServersNegativeTestMultiTenantJSON-1592777577-project-member] [instance: 736db39c-e5e5-4a54-b85a-aa5c703f432e] Instance network_info: |[{"id": "ec817996-a603-4e5e-a56d-7dcd99f8fb85", "address": "fa:16:3e:ea:40:cd", "network": {"id": "b61dca0d-78cb-4639-8786-9c0aa2890aa0", "bridge": "br-int", "label": "tempest-ServersNegativeTestMultiTenantJSON-1831984153-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, 
"meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "0752074183dc4976bd1966f655ce528b", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "456bd8a2-0fb6-4b17-9d25-08e7995c5184", "external-id": "nsx-vlan-transportzone-65", "segmentation_id": 65, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapec817996-a6", "ovs_interfaceid": "ec817996-a603-4e5e-a56d-7dcd99f8fb85", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=68906) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1661.419694] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-5a9e0d39-3b4d-4e01-a665-7603e8d12e3f tempest-ServersNegativeTestMultiTenantJSON-1592777577 tempest-ServersNegativeTestMultiTenantJSON-1592777577-project-member] [instance: 736db39c-e5e5-4a54-b85a-aa5c703f432e] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:ea:40:cd', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '456bd8a2-0fb6-4b17-9d25-08e7995c5184', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'ec817996-a603-4e5e-a56d-7dcd99f8fb85', 'vif_model': 'vmxnet3'}] {{(pid=68906) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1661.427517] env[68906]: DEBUG nova.virt.vmwareapi.vm_util [None req-5a9e0d39-3b4d-4e01-a665-7603e8d12e3f tempest-ServersNegativeTestMultiTenantJSON-1592777577 tempest-ServersNegativeTestMultiTenantJSON-1592777577-project-member] Creating folder: Project (0752074183dc4976bd1966f655ce528b). Parent ref: group-v694750. {{(pid=68906) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1661.428080] env[68906]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-31a5fe28-2b9e-441c-aae1-4548aad27af2 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1661.438562] env[68906]: INFO nova.virt.vmwareapi.vm_util [None req-5a9e0d39-3b4d-4e01-a665-7603e8d12e3f tempest-ServersNegativeTestMultiTenantJSON-1592777577 tempest-ServersNegativeTestMultiTenantJSON-1592777577-project-member] Created folder: Project (0752074183dc4976bd1966f655ce528b) in parent group-v694750. [ 1661.438763] env[68906]: DEBUG nova.virt.vmwareapi.vm_util [None req-5a9e0d39-3b4d-4e01-a665-7603e8d12e3f tempest-ServersNegativeTestMultiTenantJSON-1592777577 tempest-ServersNegativeTestMultiTenantJSON-1592777577-project-member] Creating folder: Instances. Parent ref: group-v694840. {{(pid=68906) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1661.438990] env[68906]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-540217b3-bcef-4cc3-81cc-eacbfc026607 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1661.449035] env[68906]: INFO nova.virt.vmwareapi.vm_util [None req-5a9e0d39-3b4d-4e01-a665-7603e8d12e3f tempest-ServersNegativeTestMultiTenantJSON-1592777577 tempest-ServersNegativeTestMultiTenantJSON-1592777577-project-member] Created folder: Instances in parent group-v694840. 
[ 1661.449035] env[68906]: DEBUG oslo.service.loopingcall [None req-5a9e0d39-3b4d-4e01-a665-7603e8d12e3f tempest-ServersNegativeTestMultiTenantJSON-1592777577 tempest-ServersNegativeTestMultiTenantJSON-1592777577-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=68906) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1661.449275] env[68906]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 736db39c-e5e5-4a54-b85a-aa5c703f432e] Creating VM on the ESX host {{(pid=68906) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1661.449480] env[68906]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-edca5aac-ccb9-4df3-a432-444e4e768960 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1661.468920] env[68906]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1661.468920] env[68906]: value = "task-3475421" [ 1661.468920] env[68906]: _type = "Task" [ 1661.468920] env[68906]: } to complete. {{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1661.476325] env[68906]: DEBUG oslo_vmware.api [-] Task: {'id': task-3475421, 'name': CreateVM_Task} progress is 0%. {{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1661.721548] env[68906]: DEBUG nova.compute.manager [req-429e20a4-1f89-4cce-b543-81d7cee844e3 req-a2f91198-7120-4a16-a863-05edb4795d26 service nova] [instance: 736db39c-e5e5-4a54-b85a-aa5c703f432e] Received event network-vif-plugged-ec817996-a603-4e5e-a56d-7dcd99f8fb85 {{(pid=68906) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1661.721786] env[68906]: DEBUG oslo_concurrency.lockutils [req-429e20a4-1f89-4cce-b543-81d7cee844e3 req-a2f91198-7120-4a16-a863-05edb4795d26 service nova] Acquiring lock "736db39c-e5e5-4a54-b85a-aa5c703f432e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1661.722069] env[68906]: DEBUG oslo_concurrency.lockutils [req-429e20a4-1f89-4cce-b543-81d7cee844e3 req-a2f91198-7120-4a16-a863-05edb4795d26 service nova] Lock "736db39c-e5e5-4a54-b85a-aa5c703f432e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1661.722263] env[68906]: DEBUG oslo_concurrency.lockutils [req-429e20a4-1f89-4cce-b543-81d7cee844e3 req-a2f91198-7120-4a16-a863-05edb4795d26 service nova] Lock "736db39c-e5e5-4a54-b85a-aa5c703f432e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1661.722462] env[68906]: DEBUG nova.compute.manager [req-429e20a4-1f89-4cce-b543-81d7cee844e3 req-a2f91198-7120-4a16-a863-05edb4795d26 service nova] [instance: 736db39c-e5e5-4a54-b85a-aa5c703f432e] No waiting events found dispatching network-vif-plugged-ec817996-a603-4e5e-a56d-7dcd99f8fb85 {{(pid=68906) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1661.722631] env[68906]: WARNING nova.compute.manager [req-429e20a4-1f89-4cce-b543-81d7cee844e3 req-a2f91198-7120-4a16-a863-05edb4795d26 service nova] [instance: 
736db39c-e5e5-4a54-b85a-aa5c703f432e] Received unexpected event network-vif-plugged-ec817996-a603-4e5e-a56d-7dcd99f8fb85 for instance with vm_state building and task_state spawning. [ 1661.722790] env[68906]: DEBUG nova.compute.manager [req-429e20a4-1f89-4cce-b543-81d7cee844e3 req-a2f91198-7120-4a16-a863-05edb4795d26 service nova] [instance: 736db39c-e5e5-4a54-b85a-aa5c703f432e] Received event network-changed-ec817996-a603-4e5e-a56d-7dcd99f8fb85 {{(pid=68906) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1661.722946] env[68906]: DEBUG nova.compute.manager [req-429e20a4-1f89-4cce-b543-81d7cee844e3 req-a2f91198-7120-4a16-a863-05edb4795d26 service nova] [instance: 736db39c-e5e5-4a54-b85a-aa5c703f432e] Refreshing instance network info cache due to event network-changed-ec817996-a603-4e5e-a56d-7dcd99f8fb85. {{(pid=68906) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1661.723148] env[68906]: DEBUG oslo_concurrency.lockutils [req-429e20a4-1f89-4cce-b543-81d7cee844e3 req-a2f91198-7120-4a16-a863-05edb4795d26 service nova] Acquiring lock "refresh_cache-736db39c-e5e5-4a54-b85a-aa5c703f432e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1661.723287] env[68906]: DEBUG oslo_concurrency.lockutils [req-429e20a4-1f89-4cce-b543-81d7cee844e3 req-a2f91198-7120-4a16-a863-05edb4795d26 service nova] Acquired lock "refresh_cache-736db39c-e5e5-4a54-b85a-aa5c703f432e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1661.723444] env[68906]: DEBUG nova.network.neutron [req-429e20a4-1f89-4cce-b543-81d7cee844e3 req-a2f91198-7120-4a16-a863-05edb4795d26 service nova] [instance: 736db39c-e5e5-4a54-b85a-aa5c703f432e] Refreshing network info cache for port ec817996-a603-4e5e-a56d-7dcd99f8fb85 {{(pid=68906) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1661.957482] env[68906]: DEBUG nova.network.neutron [req-429e20a4-1f89-4cce-b543-81d7cee844e3 req-a2f91198-7120-4a16-a863-05edb4795d26 service nova] [instance: 736db39c-e5e5-4a54-b85a-aa5c703f432e] Updated VIF entry in instance network info cache for port ec817996-a603-4e5e-a56d-7dcd99f8fb85. 
{{(pid=68906) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1661.957828] env[68906]: DEBUG nova.network.neutron [req-429e20a4-1f89-4cce-b543-81d7cee844e3 req-a2f91198-7120-4a16-a863-05edb4795d26 service nova] [instance: 736db39c-e5e5-4a54-b85a-aa5c703f432e] Updating instance_info_cache with network_info: [{"id": "ec817996-a603-4e5e-a56d-7dcd99f8fb85", "address": "fa:16:3e:ea:40:cd", "network": {"id": "b61dca0d-78cb-4639-8786-9c0aa2890aa0", "bridge": "br-int", "label": "tempest-ServersNegativeTestMultiTenantJSON-1831984153-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "0752074183dc4976bd1966f655ce528b", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "456bd8a2-0fb6-4b17-9d25-08e7995c5184", "external-id": "nsx-vlan-transportzone-65", "segmentation_id": 65, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapec817996-a6", "ovs_interfaceid": "ec817996-a603-4e5e-a56d-7dcd99f8fb85", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68906) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1661.968079] env[68906]: DEBUG oslo_concurrency.lockutils [req-429e20a4-1f89-4cce-b543-81d7cee844e3 req-a2f91198-7120-4a16-a863-05edb4795d26 service nova] Releasing lock "refresh_cache-736db39c-e5e5-4a54-b85a-aa5c703f432e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1661.978797] env[68906]: DEBUG oslo_vmware.api [-] Task: {'id': task-3475421, 'name': CreateVM_Task, 'duration_secs': 0.291701} completed successfully. 
{{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1661.978954] env[68906]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 736db39c-e5e5-4a54-b85a-aa5c703f432e] Created VM on the ESX host {{(pid=68906) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1661.979578] env[68906]: DEBUG oslo_concurrency.lockutils [None req-5a9e0d39-3b4d-4e01-a665-7603e8d12e3f tempest-ServersNegativeTestMultiTenantJSON-1592777577 tempest-ServersNegativeTestMultiTenantJSON-1592777577-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1661.979737] env[68906]: DEBUG oslo_concurrency.lockutils [None req-5a9e0d39-3b4d-4e01-a665-7603e8d12e3f tempest-ServersNegativeTestMultiTenantJSON-1592777577 tempest-ServersNegativeTestMultiTenantJSON-1592777577-project-member] Acquired lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1661.980058] env[68906]: DEBUG oslo_concurrency.lockutils [None req-5a9e0d39-3b4d-4e01-a665-7603e8d12e3f tempest-ServersNegativeTestMultiTenantJSON-1592777577 tempest-ServersNegativeTestMultiTenantJSON-1592777577-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1661.980319] env[68906]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-f45ab506-b48a-4955-9093-797c04f289f7 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1661.984448] env[68906]: DEBUG oslo_vmware.api [None req-5a9e0d39-3b4d-4e01-a665-7603e8d12e3f tempest-ServersNegativeTestMultiTenantJSON-1592777577 tempest-ServersNegativeTestMultiTenantJSON-1592777577-project-member] Waiting for the task: (returnval){ [ 1661.984448] env[68906]: value = "session[52a3cb0d-1212-490c-2669-91043b4da4d8]52fe59a0-d4d1-61a8-ef03-8f5f07f38d10" [ 1661.984448] env[68906]: _type = "Task" [ 1661.984448] env[68906]: } to complete. {{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1661.992269] env[68906]: DEBUG oslo_vmware.api [None req-5a9e0d39-3b4d-4e01-a665-7603e8d12e3f tempest-ServersNegativeTestMultiTenantJSON-1592777577 tempest-ServersNegativeTestMultiTenantJSON-1592777577-project-member] Task: {'id': session[52a3cb0d-1212-490c-2669-91043b4da4d8]52fe59a0-d4d1-61a8-ef03-8f5f07f38d10, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1662.496376] env[68906]: DEBUG oslo_concurrency.lockutils [None req-5a9e0d39-3b4d-4e01-a665-7603e8d12e3f tempest-ServersNegativeTestMultiTenantJSON-1592777577 tempest-ServersNegativeTestMultiTenantJSON-1592777577-project-member] Releasing lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1662.496686] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-5a9e0d39-3b4d-4e01-a665-7603e8d12e3f tempest-ServersNegativeTestMultiTenantJSON-1592777577 tempest-ServersNegativeTestMultiTenantJSON-1592777577-project-member] [instance: 736db39c-e5e5-4a54-b85a-aa5c703f432e] Processing image b1400c31-d33b-4e13-944f-4c645e62493e {{(pid=68906) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1662.496887] env[68906]: DEBUG oslo_concurrency.lockutils [None req-5a9e0d39-3b4d-4e01-a665-7603e8d12e3f tempest-ServersNegativeTestMultiTenantJSON-1592777577 tempest-ServersNegativeTestMultiTenantJSON-1592777577-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e/b1400c31-d33b-4e13-944f-4c645e62493e.vmdk" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1664.135064] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1665.140779] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1667.140116] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1667.140438] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Starting heal instance info cache {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 1667.140438] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Rebuilding the list of instances to heal {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 1667.159799] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: 6c28f571-e74a-48f9-9cc7-a9e4ddea8b09] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1667.159963] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: e0e595e3-e47e-4cf1-8977-f004eca942d1] Skipping network cache update for instance because it is Building. 
{{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1667.160108] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: 7466df8a-59a9-49b9-bff7-c4efbeae3eee] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1667.160241] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: 89171680-c76d-4826-9236-379542661ffb] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1667.160367] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: 9b884416-df89-4d8c-b2ab-0667db52a718] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1667.160501] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: aed06616-d008-4695-b66e-9f40acf5ebd3] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1667.160608] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: 17327bc3-433e-4006-93c7-e53714ed70c2] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1667.160730] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: 32f5b54d-30bf-4fe9-9622-3ff74344b3f3] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1667.160851] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: 922d81ba-c8d2-43ba-b1c5-f2943418d6a2] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1667.160970] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: 736db39c-e5e5-4a54-b85a-aa5c703f432e] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1667.161104] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Didn't find any instances for network info cache update. 
{{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 1668.140479] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1670.140622] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1670.140988] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1671.140674] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1671.141040] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=68906) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 1674.141286] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1675.135651] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1675.140437] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager.update_available_resource {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1675.151499] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1675.151801] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1675.151889] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1675.152064] env[68906]: DEBUG nova.compute.resource_tracker [None 
req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=68906) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1675.153226] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e5c9ab58-8ce5-495c-9fc3-2f750191daa9 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1675.162282] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0a90e6f5-8505-4e01-b91a-e5656fac0c33 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1675.175773] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fb5db7c9-8b9a-4f45-964f-879e3cbcc425 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1675.181874] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3a78b51f-14bd-477d-9d8c-fdc9df15d70d {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1675.210148] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180968MB free_disk=93GB free_vcpus=48 pci_devices=None {{(pid=68906) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1675.210294] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1675.210482] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1675.278565] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 6c28f571-e74a-48f9-9cc7-a9e4ddea8b09 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1675.278731] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance e0e595e3-e47e-4cf1-8977-f004eca942d1 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1675.278883] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 7466df8a-59a9-49b9-bff7-c4efbeae3eee actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1675.279033] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 89171680-c76d-4826-9236-379542661ffb actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1675.279159] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 9b884416-df89-4d8c-b2ab-0667db52a718 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1675.279278] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance aed06616-d008-4695-b66e-9f40acf5ebd3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1675.279398] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 17327bc3-433e-4006-93c7-e53714ed70c2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1675.279515] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 32f5b54d-30bf-4fe9-9622-3ff74344b3f3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1675.279631] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 922d81ba-c8d2-43ba-b1c5-f2943418d6a2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1675.279744] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 736db39c-e5e5-4a54-b85a-aa5c703f432e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1675.289967] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 7faa4c32-7572-4594-a760-e928607bf2b6 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1675.299717] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance ce6e5cd6-efb8-46d1-811d-74c084661cce has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1675.309403] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 7994d291-b4bf-48f5-ad34-c1f484d77f6e has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1675.318657] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 860248ea-e77b-4ff6-af64-b75f88a31348 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1675.318905] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=68906) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1675.319076] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=68906) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1675.470214] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-70b3c888-5b60-4073-967a-a89a9d3077be {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1675.477835] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-35f34aec-389e-45e6-9ad6-622df6d6b6d0 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1675.509312] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-24e1d422-326c-4345-8501-b08034405a1f {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1675.516459] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-438c11f8-cc3e-49e7-9bd0-d820b54f2b6c {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1675.529620] env[68906]: DEBUG nova.compute.provider_tree [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Inventory has not changed in ProviderTree for provider: 
1119f6db-bfd7-4ef3-bdff-5c6974dc249b {{(pid=68906) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1675.537636] env[68906]: DEBUG nova.scheduler.client.report [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Inventory has not changed for provider 1119f6db-bfd7-4ef3-bdff-5c6974dc249b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 93, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68906) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1675.553134] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=68906) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1675.553340] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.343s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1678.310309] env[68906]: DEBUG oslo_concurrency.lockutils [None req-04b74860-fe41-4e76-8019-c7a9761fa215 tempest-ServersNegativeTestMultiTenantJSON-1592777577 tempest-ServersNegativeTestMultiTenantJSON-1592777577-project-member] Acquiring lock "736db39c-e5e5-4a54-b85a-aa5c703f432e" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1678.996653] env[68906]: DEBUG oslo_concurrency.lockutils [None req-d367298b-38d5-456e-960c-a902d561e42c tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] Acquiring lock "3cfde5a7-3148-426c-8867-ffafb33dc95b" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1678.996653] env[68906]: DEBUG oslo_concurrency.lockutils [None req-d367298b-38d5-456e-960c-a902d561e42c tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] Lock "3cfde5a7-3148-426c-8867-ffafb33dc95b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1685.589346] env[68906]: DEBUG oslo_concurrency.lockutils [None req-67ed4526-96df-48ba-bd9f-4546cfb77ff4 tempest-DeleteServersTestJSON-1763795391 tempest-DeleteServersTestJSON-1763795391-project-member] Acquiring lock "709defd2-4089-410e-b317-c41c97e01f62" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1685.589684] env[68906]: DEBUG oslo_concurrency.lockutils [None req-67ed4526-96df-48ba-bd9f-4546cfb77ff4 tempest-DeleteServersTestJSON-1763795391 tempest-DeleteServersTestJSON-1763795391-project-member] Lock "709defd2-4089-410e-b317-c41c97e01f62" 
acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1707.097281] env[68906]: WARNING oslo_vmware.rw_handles [None req-bc278641-e9dc-495f-b85d-d30d2e1da09f tempest-ServerShowV254Test-1577433897 tempest-ServerShowV254Test-1577433897-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1707.097281] env[68906]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1707.097281] env[68906]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1707.097281] env[68906]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1707.097281] env[68906]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1707.097281] env[68906]: ERROR oslo_vmware.rw_handles response.begin() [ 1707.097281] env[68906]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1707.097281] env[68906]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1707.097281] env[68906]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1707.097281] env[68906]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1707.097281] env[68906]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1707.097281] env[68906]: ERROR oslo_vmware.rw_handles [ 1707.098074] env[68906]: DEBUG nova.virt.vmwareapi.images [None req-bc278641-e9dc-495f-b85d-d30d2e1da09f tempest-ServerShowV254Test-1577433897 tempest-ServerShowV254Test-1577433897-project-member] [instance: e0e595e3-e47e-4cf1-8977-f004eca942d1] Downloaded image file data b1400c31-d33b-4e13-944f-4c645e62493e to vmware_temp/f20a4bf2-c69b-43da-9237-8efa6ad124e1/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk on the data store datastore2 {{(pid=68906) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1707.099935] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-bc278641-e9dc-495f-b85d-d30d2e1da09f tempest-ServerShowV254Test-1577433897 tempest-ServerShowV254Test-1577433897-project-member] [instance: e0e595e3-e47e-4cf1-8977-f004eca942d1] Caching image {{(pid=68906) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1707.100208] env[68906]: DEBUG nova.virt.vmwareapi.vm_util [None req-bc278641-e9dc-495f-b85d-d30d2e1da09f tempest-ServerShowV254Test-1577433897 tempest-ServerShowV254Test-1577433897-project-member] Copying Virtual Disk [datastore2] vmware_temp/f20a4bf2-c69b-43da-9237-8efa6ad124e1/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk to [datastore2] vmware_temp/f20a4bf2-c69b-43da-9237-8efa6ad124e1/b1400c31-d33b-4e13-944f-4c645e62493e/b1400c31-d33b-4e13-944f-4c645e62493e.vmdk {{(pid=68906) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1707.100571] env[68906]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-443684ad-e8fa-4d4c-a143-09e9efd3a6a5 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1707.108982] env[68906]: DEBUG oslo_vmware.api [None req-bc278641-e9dc-495f-b85d-d30d2e1da09f 
tempest-ServerShowV254Test-1577433897 tempest-ServerShowV254Test-1577433897-project-member] Waiting for the task: (returnval){ [ 1707.108982] env[68906]: value = "task-3475422" [ 1707.108982] env[68906]: _type = "Task" [ 1707.108982] env[68906]: } to complete. {{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1707.116622] env[68906]: DEBUG oslo_vmware.api [None req-bc278641-e9dc-495f-b85d-d30d2e1da09f tempest-ServerShowV254Test-1577433897 tempest-ServerShowV254Test-1577433897-project-member] Task: {'id': task-3475422, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1707.619540] env[68906]: DEBUG oslo_vmware.exceptions [None req-bc278641-e9dc-495f-b85d-d30d2e1da09f tempest-ServerShowV254Test-1577433897 tempest-ServerShowV254Test-1577433897-project-member] Fault InvalidArgument not matched. {{(pid=68906) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1707.619822] env[68906]: DEBUG oslo_concurrency.lockutils [None req-bc278641-e9dc-495f-b85d-d30d2e1da09f tempest-ServerShowV254Test-1577433897 tempest-ServerShowV254Test-1577433897-project-member] Releasing lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e/b1400c31-d33b-4e13-944f-4c645e62493e.vmdk" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1707.620430] env[68906]: ERROR nova.compute.manager [None req-bc278641-e9dc-495f-b85d-d30d2e1da09f tempest-ServerShowV254Test-1577433897 tempest-ServerShowV254Test-1577433897-project-member] [instance: e0e595e3-e47e-4cf1-8977-f004eca942d1] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1707.620430] env[68906]: Faults: ['InvalidArgument'] [ 1707.620430] env[68906]: ERROR nova.compute.manager [instance: e0e595e3-e47e-4cf1-8977-f004eca942d1] Traceback (most recent call last): [ 1707.620430] env[68906]: ERROR nova.compute.manager [instance: e0e595e3-e47e-4cf1-8977-f004eca942d1] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1707.620430] env[68906]: ERROR nova.compute.manager [instance: e0e595e3-e47e-4cf1-8977-f004eca942d1] yield resources [ 1707.620430] env[68906]: ERROR nova.compute.manager [instance: e0e595e3-e47e-4cf1-8977-f004eca942d1] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1707.620430] env[68906]: ERROR nova.compute.manager [instance: e0e595e3-e47e-4cf1-8977-f004eca942d1] self.driver.spawn(context, instance, image_meta, [ 1707.620430] env[68906]: ERROR nova.compute.manager [instance: e0e595e3-e47e-4cf1-8977-f004eca942d1] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1707.620430] env[68906]: ERROR nova.compute.manager [instance: e0e595e3-e47e-4cf1-8977-f004eca942d1] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1707.620430] env[68906]: ERROR nova.compute.manager [instance: e0e595e3-e47e-4cf1-8977-f004eca942d1] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1707.620430] env[68906]: ERROR nova.compute.manager [instance: e0e595e3-e47e-4cf1-8977-f004eca942d1] self._fetch_image_if_missing(context, vi) [ 1707.620430] env[68906]: ERROR nova.compute.manager [instance: e0e595e3-e47e-4cf1-8977-f004eca942d1] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in 
_fetch_image_if_missing [ 1707.620796] env[68906]: ERROR nova.compute.manager [instance: e0e595e3-e47e-4cf1-8977-f004eca942d1] image_cache(vi, tmp_image_ds_loc) [ 1707.620796] env[68906]: ERROR nova.compute.manager [instance: e0e595e3-e47e-4cf1-8977-f004eca942d1] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1707.620796] env[68906]: ERROR nova.compute.manager [instance: e0e595e3-e47e-4cf1-8977-f004eca942d1] vm_util.copy_virtual_disk( [ 1707.620796] env[68906]: ERROR nova.compute.manager [instance: e0e595e3-e47e-4cf1-8977-f004eca942d1] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1707.620796] env[68906]: ERROR nova.compute.manager [instance: e0e595e3-e47e-4cf1-8977-f004eca942d1] session._wait_for_task(vmdk_copy_task) [ 1707.620796] env[68906]: ERROR nova.compute.manager [instance: e0e595e3-e47e-4cf1-8977-f004eca942d1] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1707.620796] env[68906]: ERROR nova.compute.manager [instance: e0e595e3-e47e-4cf1-8977-f004eca942d1] return self.wait_for_task(task_ref) [ 1707.620796] env[68906]: ERROR nova.compute.manager [instance: e0e595e3-e47e-4cf1-8977-f004eca942d1] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1707.620796] env[68906]: ERROR nova.compute.manager [instance: e0e595e3-e47e-4cf1-8977-f004eca942d1] return evt.wait() [ 1707.620796] env[68906]: ERROR nova.compute.manager [instance: e0e595e3-e47e-4cf1-8977-f004eca942d1] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1707.620796] env[68906]: ERROR nova.compute.manager [instance: e0e595e3-e47e-4cf1-8977-f004eca942d1] result = hub.switch() [ 1707.620796] env[68906]: ERROR nova.compute.manager [instance: e0e595e3-e47e-4cf1-8977-f004eca942d1] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1707.620796] env[68906]: ERROR nova.compute.manager [instance: e0e595e3-e47e-4cf1-8977-f004eca942d1] return self.greenlet.switch() [ 1707.621221] env[68906]: ERROR nova.compute.manager [instance: e0e595e3-e47e-4cf1-8977-f004eca942d1] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1707.621221] env[68906]: ERROR nova.compute.manager [instance: e0e595e3-e47e-4cf1-8977-f004eca942d1] self.f(*self.args, **self.kw) [ 1707.621221] env[68906]: ERROR nova.compute.manager [instance: e0e595e3-e47e-4cf1-8977-f004eca942d1] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1707.621221] env[68906]: ERROR nova.compute.manager [instance: e0e595e3-e47e-4cf1-8977-f004eca942d1] raise exceptions.translate_fault(task_info.error) [ 1707.621221] env[68906]: ERROR nova.compute.manager [instance: e0e595e3-e47e-4cf1-8977-f004eca942d1] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1707.621221] env[68906]: ERROR nova.compute.manager [instance: e0e595e3-e47e-4cf1-8977-f004eca942d1] Faults: ['InvalidArgument'] [ 1707.621221] env[68906]: ERROR nova.compute.manager [instance: e0e595e3-e47e-4cf1-8977-f004eca942d1] [ 1707.621221] env[68906]: INFO nova.compute.manager [None req-bc278641-e9dc-495f-b85d-d30d2e1da09f tempest-ServerShowV254Test-1577433897 tempest-ServerShowV254Test-1577433897-project-member] [instance: e0e595e3-e47e-4cf1-8977-f004eca942d1] Terminating instance [ 1707.622371] env[68906]: DEBUG 
oslo_concurrency.lockutils [None req-8cc1b37d-3cc0-4e00-8b4e-1ea1e183df21 tempest-ServerAddressesTestJSON-218061599 tempest-ServerAddressesTestJSON-218061599-project-member] Acquired lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e/b1400c31-d33b-4e13-944f-4c645e62493e.vmdk" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1707.622580] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-8cc1b37d-3cc0-4e00-8b4e-1ea1e183df21 tempest-ServerAddressesTestJSON-218061599 tempest-ServerAddressesTestJSON-218061599-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=68906) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1707.622809] env[68906]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-56035926-6811-4160-9106-9e7b67d13f9d {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1707.626353] env[68906]: DEBUG oslo_concurrency.lockutils [None req-bc278641-e9dc-495f-b85d-d30d2e1da09f tempest-ServerShowV254Test-1577433897 tempest-ServerShowV254Test-1577433897-project-member] Acquiring lock "refresh_cache-e0e595e3-e47e-4cf1-8977-f004eca942d1" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1707.626514] env[68906]: DEBUG oslo_concurrency.lockutils [None req-bc278641-e9dc-495f-b85d-d30d2e1da09f tempest-ServerShowV254Test-1577433897 tempest-ServerShowV254Test-1577433897-project-member] Acquired lock "refresh_cache-e0e595e3-e47e-4cf1-8977-f004eca942d1" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1707.626686] env[68906]: DEBUG nova.network.neutron [None req-bc278641-e9dc-495f-b85d-d30d2e1da09f tempest-ServerShowV254Test-1577433897 tempest-ServerShowV254Test-1577433897-project-member] [instance: e0e595e3-e47e-4cf1-8977-f004eca942d1] Building network info cache for instance {{(pid=68906) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1707.631076] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-8cc1b37d-3cc0-4e00-8b4e-1ea1e183df21 tempest-ServerAddressesTestJSON-218061599 tempest-ServerAddressesTestJSON-218061599-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=68906) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1707.631076] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-8cc1b37d-3cc0-4e00-8b4e-1ea1e183df21 tempest-ServerAddressesTestJSON-218061599 tempest-ServerAddressesTestJSON-218061599-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=68906) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1707.631520] env[68906]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-78b5fd5a-4b29-4385-9769-d145c809e274 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1707.638220] env[68906]: DEBUG oslo_vmware.api [None req-8cc1b37d-3cc0-4e00-8b4e-1ea1e183df21 tempest-ServerAddressesTestJSON-218061599 tempest-ServerAddressesTestJSON-218061599-project-member] Waiting for the task: (returnval){ [ 1707.638220] env[68906]: value = "session[52a3cb0d-1212-490c-2669-91043b4da4d8]522c5468-03fb-d3f3-c675-d115500c4755" [ 1707.638220] env[68906]: _type = "Task" [ 1707.638220] env[68906]: } to complete. 
{{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1707.646497] env[68906]: DEBUG oslo_vmware.api [None req-8cc1b37d-3cc0-4e00-8b4e-1ea1e183df21 tempest-ServerAddressesTestJSON-218061599 tempest-ServerAddressesTestJSON-218061599-project-member] Task: {'id': session[52a3cb0d-1212-490c-2669-91043b4da4d8]522c5468-03fb-d3f3-c675-d115500c4755, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1707.714916] env[68906]: DEBUG nova.network.neutron [None req-bc278641-e9dc-495f-b85d-d30d2e1da09f tempest-ServerShowV254Test-1577433897 tempest-ServerShowV254Test-1577433897-project-member] [instance: e0e595e3-e47e-4cf1-8977-f004eca942d1] Instance cache missing network info. {{(pid=68906) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1707.773781] env[68906]: DEBUG nova.network.neutron [None req-bc278641-e9dc-495f-b85d-d30d2e1da09f tempest-ServerShowV254Test-1577433897 tempest-ServerShowV254Test-1577433897-project-member] [instance: e0e595e3-e47e-4cf1-8977-f004eca942d1] Updating instance_info_cache with network_info: [] {{(pid=68906) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1707.783406] env[68906]: DEBUG oslo_concurrency.lockutils [None req-bc278641-e9dc-495f-b85d-d30d2e1da09f tempest-ServerShowV254Test-1577433897 tempest-ServerShowV254Test-1577433897-project-member] Releasing lock "refresh_cache-e0e595e3-e47e-4cf1-8977-f004eca942d1" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1707.783888] env[68906]: DEBUG nova.compute.manager [None req-bc278641-e9dc-495f-b85d-d30d2e1da09f tempest-ServerShowV254Test-1577433897 tempest-ServerShowV254Test-1577433897-project-member] [instance: e0e595e3-e47e-4cf1-8977-f004eca942d1] Start destroying the instance on the hypervisor. 
{{(pid=68906) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1707.784129] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-bc278641-e9dc-495f-b85d-d30d2e1da09f tempest-ServerShowV254Test-1577433897 tempest-ServerShowV254Test-1577433897-project-member] [instance: e0e595e3-e47e-4cf1-8977-f004eca942d1] Destroying instance {{(pid=68906) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1707.785257] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-11a205a3-8ce7-4283-b3ed-58d4ef650cba {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1707.793572] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-bc278641-e9dc-495f-b85d-d30d2e1da09f tempest-ServerShowV254Test-1577433897 tempest-ServerShowV254Test-1577433897-project-member] [instance: e0e595e3-e47e-4cf1-8977-f004eca942d1] Unregistering the VM {{(pid=68906) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1707.793799] env[68906]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-ab68745f-d8b5-495e-af1c-e2b73fd19c5a {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1707.824383] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-bc278641-e9dc-495f-b85d-d30d2e1da09f tempest-ServerShowV254Test-1577433897 tempest-ServerShowV254Test-1577433897-project-member] [instance: e0e595e3-e47e-4cf1-8977-f004eca942d1] Unregistered the VM {{(pid=68906) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1707.824621] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-bc278641-e9dc-495f-b85d-d30d2e1da09f tempest-ServerShowV254Test-1577433897 tempest-ServerShowV254Test-1577433897-project-member] [instance: e0e595e3-e47e-4cf1-8977-f004eca942d1] Deleting contents of the VM from datastore datastore2 {{(pid=68906) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1707.824814] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-bc278641-e9dc-495f-b85d-d30d2e1da09f tempest-ServerShowV254Test-1577433897 tempest-ServerShowV254Test-1577433897-project-member] Deleting the datastore file [datastore2] e0e595e3-e47e-4cf1-8977-f004eca942d1 {{(pid=68906) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1707.825251] env[68906]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-54c62301-c1af-429e-abe9-dbd0d5b7d35c {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1707.831504] env[68906]: DEBUG oslo_vmware.api [None req-bc278641-e9dc-495f-b85d-d30d2e1da09f tempest-ServerShowV254Test-1577433897 tempest-ServerShowV254Test-1577433897-project-member] Waiting for the task: (returnval){ [ 1707.831504] env[68906]: value = "task-3475424" [ 1707.831504] env[68906]: _type = "Task" [ 1707.831504] env[68906]: } to complete. {{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1707.838930] env[68906]: DEBUG oslo_vmware.api [None req-bc278641-e9dc-495f-b85d-d30d2e1da09f tempest-ServerShowV254Test-1577433897 tempest-ServerShowV254Test-1577433897-project-member] Task: {'id': task-3475424, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1708.149444] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-8cc1b37d-3cc0-4e00-8b4e-1ea1e183df21 tempest-ServerAddressesTestJSON-218061599 tempest-ServerAddressesTestJSON-218061599-project-member] [instance: 6c28f571-e74a-48f9-9cc7-a9e4ddea8b09] Preparing fetch location {{(pid=68906) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1708.149444] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-8cc1b37d-3cc0-4e00-8b4e-1ea1e183df21 tempest-ServerAddressesTestJSON-218061599 tempest-ServerAddressesTestJSON-218061599-project-member] Creating directory with path [datastore2] vmware_temp/3255d9a9-c1d1-4deb-b5d5-64da4b1eb652/b1400c31-d33b-4e13-944f-4c645e62493e {{(pid=68906) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1708.149775] env[68906]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-dbdff929-8b68-43f0-bb5f-def91b7437c7 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1708.160418] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-8cc1b37d-3cc0-4e00-8b4e-1ea1e183df21 tempest-ServerAddressesTestJSON-218061599 tempest-ServerAddressesTestJSON-218061599-project-member] Created directory with path [datastore2] vmware_temp/3255d9a9-c1d1-4deb-b5d5-64da4b1eb652/b1400c31-d33b-4e13-944f-4c645e62493e {{(pid=68906) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1708.160595] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-8cc1b37d-3cc0-4e00-8b4e-1ea1e183df21 tempest-ServerAddressesTestJSON-218061599 tempest-ServerAddressesTestJSON-218061599-project-member] [instance: 6c28f571-e74a-48f9-9cc7-a9e4ddea8b09] Fetch image to [datastore2] vmware_temp/3255d9a9-c1d1-4deb-b5d5-64da4b1eb652/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk {{(pid=68906) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1708.160759] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-8cc1b37d-3cc0-4e00-8b4e-1ea1e183df21 tempest-ServerAddressesTestJSON-218061599 tempest-ServerAddressesTestJSON-218061599-project-member] [instance: 6c28f571-e74a-48f9-9cc7-a9e4ddea8b09] Downloading image file data b1400c31-d33b-4e13-944f-4c645e62493e to [datastore2] vmware_temp/3255d9a9-c1d1-4deb-b5d5-64da4b1eb652/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk on the data store datastore2 {{(pid=68906) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1708.161570] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f9343a62-8d84-4c6b-a978-a977a8fb6d5c {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1708.168104] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5475781e-e681-4506-bbe2-96be6a6dd446 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1708.177382] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b6a377c2-9f97-409b-bcdc-ea60e32e4f40 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1708.207984] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-ae135731-7504-4e23-84a8-f73ee476f88d {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1708.213170] env[68906]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-2e8d76b2-0623-4622-9374-4992a103ac0d {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1708.232573] env[68906]: DEBUG nova.virt.vmwareapi.images [None req-8cc1b37d-3cc0-4e00-8b4e-1ea1e183df21 tempest-ServerAddressesTestJSON-218061599 tempest-ServerAddressesTestJSON-218061599-project-member] [instance: 6c28f571-e74a-48f9-9cc7-a9e4ddea8b09] Downloading image file data b1400c31-d33b-4e13-944f-4c645e62493e to the data store datastore2 {{(pid=68906) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1708.344232] env[68906]: DEBUG oslo_vmware.api [None req-bc278641-e9dc-495f-b85d-d30d2e1da09f tempest-ServerShowV254Test-1577433897 tempest-ServerShowV254Test-1577433897-project-member] Task: {'id': task-3475424, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.034104} completed successfully. {{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1708.347431] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-bc278641-e9dc-495f-b85d-d30d2e1da09f tempest-ServerShowV254Test-1577433897 tempest-ServerShowV254Test-1577433897-project-member] Deleted the datastore file {{(pid=68906) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1708.347635] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-bc278641-e9dc-495f-b85d-d30d2e1da09f tempest-ServerShowV254Test-1577433897 tempest-ServerShowV254Test-1577433897-project-member] [instance: e0e595e3-e47e-4cf1-8977-f004eca942d1] Deleted contents of the VM from datastore datastore2 {{(pid=68906) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1708.347809] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-bc278641-e9dc-495f-b85d-d30d2e1da09f tempest-ServerShowV254Test-1577433897 tempest-ServerShowV254Test-1577433897-project-member] [instance: e0e595e3-e47e-4cf1-8977-f004eca942d1] Instance destroyed {{(pid=68906) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1708.347981] env[68906]: INFO nova.compute.manager [None req-bc278641-e9dc-495f-b85d-d30d2e1da09f tempest-ServerShowV254Test-1577433897 tempest-ServerShowV254Test-1577433897-project-member] [instance: e0e595e3-e47e-4cf1-8977-f004eca942d1] Took 0.56 seconds to destroy the instance on the hypervisor. [ 1708.348244] env[68906]: DEBUG oslo.service.loopingcall [None req-bc278641-e9dc-495f-b85d-d30d2e1da09f tempest-ServerShowV254Test-1577433897 tempest-ServerShowV254Test-1577433897-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=68906) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1708.349021] env[68906]: DEBUG nova.compute.manager [-] [instance: e0e595e3-e47e-4cf1-8977-f004eca942d1] Skipping network deallocation for instance since networking was not requested.
{{(pid=68906) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1708.350647] env[68906]: DEBUG nova.compute.claims [None req-bc278641-e9dc-495f-b85d-d30d2e1da09f tempest-ServerShowV254Test-1577433897 tempest-ServerShowV254Test-1577433897-project-member] [instance: e0e595e3-e47e-4cf1-8977-f004eca942d1] Aborting claim: {{(pid=68906) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1708.350819] env[68906]: DEBUG oslo_concurrency.lockutils [None req-bc278641-e9dc-495f-b85d-d30d2e1da09f tempest-ServerShowV254Test-1577433897 tempest-ServerShowV254Test-1577433897-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1708.351051] env[68906]: DEBUG oslo_concurrency.lockutils [None req-bc278641-e9dc-495f-b85d-d30d2e1da09f tempest-ServerShowV254Test-1577433897 tempest-ServerShowV254Test-1577433897-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1708.370941] env[68906]: DEBUG oslo_vmware.rw_handles [None req-8cc1b37d-3cc0-4e00-8b4e-1ea1e183df21 tempest-ServerAddressesTestJSON-218061599 tempest-ServerAddressesTestJSON-218061599-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/3255d9a9-c1d1-4deb-b5d5-64da4b1eb652/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=68906) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1708.438166] env[68906]: DEBUG oslo_vmware.rw_handles [None req-8cc1b37d-3cc0-4e00-8b4e-1ea1e183df21 tempest-ServerAddressesTestJSON-218061599 tempest-ServerAddressesTestJSON-218061599-project-member] Completed reading data from the image iterator. {{(pid=68906) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1708.438464] env[68906]: DEBUG oslo_vmware.rw_handles [None req-8cc1b37d-3cc0-4e00-8b4e-1ea1e183df21 tempest-ServerAddressesTestJSON-218061599 tempest-ServerAddressesTestJSON-218061599-project-member] Closing write handle for https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/3255d9a9-c1d1-4deb-b5d5-64da4b1eb652/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=68906) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1708.595642] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6072f379-c332-44be-b25f-73aec30a823e {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1708.603294] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f7ef5915-b674-4a4e-8f91-7841f5c9915d {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1708.632558] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9947dd2b-af13-43a2-92f5-8529b5915319 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1708.639528] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-566189c2-0ad6-45fd-8e64-78842fd24af9 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1708.653457] env[68906]: DEBUG nova.compute.provider_tree [None req-bc278641-e9dc-495f-b85d-d30d2e1da09f tempest-ServerShowV254Test-1577433897 tempest-ServerShowV254Test-1577433897-project-member] Inventory has not changed in ProviderTree for provider: 1119f6db-bfd7-4ef3-bdff-5c6974dc249b {{(pid=68906) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1708.662027] env[68906]: DEBUG nova.scheduler.client.report [None req-bc278641-e9dc-495f-b85d-d30d2e1da09f tempest-ServerShowV254Test-1577433897 tempest-ServerShowV254Test-1577433897-project-member] Inventory has not changed for provider 1119f6db-bfd7-4ef3-bdff-5c6974dc249b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 93, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68906) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1708.674982] env[68906]: DEBUG oslo_concurrency.lockutils [None req-bc278641-e9dc-495f-b85d-d30d2e1da09f tempest-ServerShowV254Test-1577433897 tempest-ServerShowV254Test-1577433897-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.324s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1708.675543] env[68906]: ERROR nova.compute.manager [None req-bc278641-e9dc-495f-b85d-d30d2e1da09f tempest-ServerShowV254Test-1577433897 tempest-ServerShowV254Test-1577433897-project-member] [instance: e0e595e3-e47e-4cf1-8977-f004eca942d1] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1708.675543] env[68906]: Faults: ['InvalidArgument'] [ 1708.675543] env[68906]: ERROR nova.compute.manager [instance: e0e595e3-e47e-4cf1-8977-f004eca942d1] Traceback (most recent call last): [ 1708.675543] env[68906]: ERROR nova.compute.manager [instance: e0e595e3-e47e-4cf1-8977-f004eca942d1] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1708.675543] env[68906]: ERROR nova.compute.manager 
[instance: e0e595e3-e47e-4cf1-8977-f004eca942d1] self.driver.spawn(context, instance, image_meta, [ 1708.675543] env[68906]: ERROR nova.compute.manager [instance: e0e595e3-e47e-4cf1-8977-f004eca942d1] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1708.675543] env[68906]: ERROR nova.compute.manager [instance: e0e595e3-e47e-4cf1-8977-f004eca942d1] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1708.675543] env[68906]: ERROR nova.compute.manager [instance: e0e595e3-e47e-4cf1-8977-f004eca942d1] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1708.675543] env[68906]: ERROR nova.compute.manager [instance: e0e595e3-e47e-4cf1-8977-f004eca942d1] self._fetch_image_if_missing(context, vi) [ 1708.675543] env[68906]: ERROR nova.compute.manager [instance: e0e595e3-e47e-4cf1-8977-f004eca942d1] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1708.675543] env[68906]: ERROR nova.compute.manager [instance: e0e595e3-e47e-4cf1-8977-f004eca942d1] image_cache(vi, tmp_image_ds_loc) [ 1708.675543] env[68906]: ERROR nova.compute.manager [instance: e0e595e3-e47e-4cf1-8977-f004eca942d1] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1708.675875] env[68906]: ERROR nova.compute.manager [instance: e0e595e3-e47e-4cf1-8977-f004eca942d1] vm_util.copy_virtual_disk( [ 1708.675875] env[68906]: ERROR nova.compute.manager [instance: e0e595e3-e47e-4cf1-8977-f004eca942d1] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1708.675875] env[68906]: ERROR nova.compute.manager [instance: e0e595e3-e47e-4cf1-8977-f004eca942d1] session._wait_for_task(vmdk_copy_task) [ 1708.675875] env[68906]: ERROR nova.compute.manager [instance: e0e595e3-e47e-4cf1-8977-f004eca942d1] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1708.675875] env[68906]: ERROR nova.compute.manager [instance: e0e595e3-e47e-4cf1-8977-f004eca942d1] return self.wait_for_task(task_ref) [ 1708.675875] env[68906]: ERROR nova.compute.manager [instance: e0e595e3-e47e-4cf1-8977-f004eca942d1] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1708.675875] env[68906]: ERROR nova.compute.manager [instance: e0e595e3-e47e-4cf1-8977-f004eca942d1] return evt.wait() [ 1708.675875] env[68906]: ERROR nova.compute.manager [instance: e0e595e3-e47e-4cf1-8977-f004eca942d1] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1708.675875] env[68906]: ERROR nova.compute.manager [instance: e0e595e3-e47e-4cf1-8977-f004eca942d1] result = hub.switch() [ 1708.675875] env[68906]: ERROR nova.compute.manager [instance: e0e595e3-e47e-4cf1-8977-f004eca942d1] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1708.675875] env[68906]: ERROR nova.compute.manager [instance: e0e595e3-e47e-4cf1-8977-f004eca942d1] return self.greenlet.switch() [ 1708.675875] env[68906]: ERROR nova.compute.manager [instance: e0e595e3-e47e-4cf1-8977-f004eca942d1] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1708.675875] env[68906]: ERROR nova.compute.manager [instance: e0e595e3-e47e-4cf1-8977-f004eca942d1] self.f(*self.args, **self.kw) [ 1708.676256] env[68906]: ERROR nova.compute.manager [instance: e0e595e3-e47e-4cf1-8977-f004eca942d1] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1708.676256] env[68906]: ERROR nova.compute.manager [instance: e0e595e3-e47e-4cf1-8977-f004eca942d1] raise exceptions.translate_fault(task_info.error) [ 1708.676256] env[68906]: ERROR nova.compute.manager [instance: e0e595e3-e47e-4cf1-8977-f004eca942d1] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1708.676256] env[68906]: ERROR nova.compute.manager [instance: e0e595e3-e47e-4cf1-8977-f004eca942d1] Faults: ['InvalidArgument'] [ 1708.676256] env[68906]: ERROR nova.compute.manager [instance: e0e595e3-e47e-4cf1-8977-f004eca942d1] [ 1708.676256] env[68906]: DEBUG nova.compute.utils [None req-bc278641-e9dc-495f-b85d-d30d2e1da09f tempest-ServerShowV254Test-1577433897 tempest-ServerShowV254Test-1577433897-project-member] [instance: e0e595e3-e47e-4cf1-8977-f004eca942d1] VimFaultException {{(pid=68906) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1708.677902] env[68906]: DEBUG nova.compute.manager [None req-bc278641-e9dc-495f-b85d-d30d2e1da09f tempest-ServerShowV254Test-1577433897 tempest-ServerShowV254Test-1577433897-project-member] [instance: e0e595e3-e47e-4cf1-8977-f004eca942d1] Build of instance e0e595e3-e47e-4cf1-8977-f004eca942d1 was re-scheduled: A specified parameter was not correct: fileType [ 1708.677902] env[68906]: Faults: ['InvalidArgument'] {{(pid=68906) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1708.678293] env[68906]: DEBUG nova.compute.manager [None req-bc278641-e9dc-495f-b85d-d30d2e1da09f tempest-ServerShowV254Test-1577433897 tempest-ServerShowV254Test-1577433897-project-member] [instance: e0e595e3-e47e-4cf1-8977-f004eca942d1] Unplugging VIFs for instance {{(pid=68906) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1708.678519] env[68906]: DEBUG oslo_concurrency.lockutils [None req-bc278641-e9dc-495f-b85d-d30d2e1da09f tempest-ServerShowV254Test-1577433897 tempest-ServerShowV254Test-1577433897-project-member] Acquiring lock "refresh_cache-e0e595e3-e47e-4cf1-8977-f004eca942d1" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1708.678666] env[68906]: DEBUG oslo_concurrency.lockutils [None req-bc278641-e9dc-495f-b85d-d30d2e1da09f tempest-ServerShowV254Test-1577433897 tempest-ServerShowV254Test-1577433897-project-member] Acquired lock "refresh_cache-e0e595e3-e47e-4cf1-8977-f004eca942d1" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1708.678826] env[68906]: DEBUG nova.network.neutron [None req-bc278641-e9dc-495f-b85d-d30d2e1da09f tempest-ServerShowV254Test-1577433897 tempest-ServerShowV254Test-1577433897-project-member] [instance: e0e595e3-e47e-4cf1-8977-f004eca942d1] Building network info cache for instance {{(pid=68906) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1708.702300] env[68906]: DEBUG nova.network.neutron [None req-bc278641-e9dc-495f-b85d-d30d2e1da09f tempest-ServerShowV254Test-1577433897 tempest-ServerShowV254Test-1577433897-project-member] [instance: e0e595e3-e47e-4cf1-8977-f004eca942d1] Instance cache missing network info. 
{{(pid=68906) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1708.768539] env[68906]: DEBUG nova.network.neutron [None req-bc278641-e9dc-495f-b85d-d30d2e1da09f tempest-ServerShowV254Test-1577433897 tempest-ServerShowV254Test-1577433897-project-member] [instance: e0e595e3-e47e-4cf1-8977-f004eca942d1] Updating instance_info_cache with network_info: [] {{(pid=68906) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1708.777430] env[68906]: DEBUG oslo_concurrency.lockutils [None req-bc278641-e9dc-495f-b85d-d30d2e1da09f tempest-ServerShowV254Test-1577433897 tempest-ServerShowV254Test-1577433897-project-member] Releasing lock "refresh_cache-e0e595e3-e47e-4cf1-8977-f004eca942d1" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1708.777653] env[68906]: DEBUG nova.compute.manager [None req-bc278641-e9dc-495f-b85d-d30d2e1da09f tempest-ServerShowV254Test-1577433897 tempest-ServerShowV254Test-1577433897-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=68906) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1708.777833] env[68906]: DEBUG nova.compute.manager [None req-bc278641-e9dc-495f-b85d-d30d2e1da09f tempest-ServerShowV254Test-1577433897 tempest-ServerShowV254Test-1577433897-project-member] [instance: e0e595e3-e47e-4cf1-8977-f004eca942d1] Skipping network deallocation for instance since networking was not requested. {{(pid=68906) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1708.863602] env[68906]: INFO nova.scheduler.client.report [None req-bc278641-e9dc-495f-b85d-d30d2e1da09f tempest-ServerShowV254Test-1577433897 tempest-ServerShowV254Test-1577433897-project-member] Deleted allocations for instance e0e595e3-e47e-4cf1-8977-f004eca942d1 [ 1708.881287] env[68906]: DEBUG oslo_concurrency.lockutils [None req-bc278641-e9dc-495f-b85d-d30d2e1da09f tempest-ServerShowV254Test-1577433897 tempest-ServerShowV254Test-1577433897-project-member] Lock "e0e595e3-e47e-4cf1-8977-f004eca942d1" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 622.190s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1708.882922] env[68906]: DEBUG oslo_concurrency.lockutils [None req-6f158b16-c95b-47ba-98ef-cd5e461b3344 tempest-ServerShowV254Test-1577433897 tempest-ServerShowV254Test-1577433897-project-member] Lock "e0e595e3-e47e-4cf1-8977-f004eca942d1" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 426.040s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1708.882922] env[68906]: DEBUG oslo_concurrency.lockutils [None req-6f158b16-c95b-47ba-98ef-cd5e461b3344 tempest-ServerShowV254Test-1577433897 tempest-ServerShowV254Test-1577433897-project-member] Acquiring lock "e0e595e3-e47e-4cf1-8977-f004eca942d1-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1708.882922] env[68906]: DEBUG oslo_concurrency.lockutils [None req-6f158b16-c95b-47ba-98ef-cd5e461b3344 tempest-ServerShowV254Test-1577433897 tempest-ServerShowV254Test-1577433897-project-member] Lock "e0e595e3-e47e-4cf1-8977-f004eca942d1-events" acquired
by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1708.882922] env[68906]: DEBUG oslo_concurrency.lockutils [None req-6f158b16-c95b-47ba-98ef-cd5e461b3344 tempest-ServerShowV254Test-1577433897 tempest-ServerShowV254Test-1577433897-project-member] Lock "e0e595e3-e47e-4cf1-8977-f004eca942d1-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1708.884775] env[68906]: INFO nova.compute.manager [None req-6f158b16-c95b-47ba-98ef-cd5e461b3344 tempest-ServerShowV254Test-1577433897 tempest-ServerShowV254Test-1577433897-project-member] [instance: e0e595e3-e47e-4cf1-8977-f004eca942d1] Terminating instance [ 1708.886228] env[68906]: DEBUG oslo_concurrency.lockutils [None req-6f158b16-c95b-47ba-98ef-cd5e461b3344 tempest-ServerShowV254Test-1577433897 tempest-ServerShowV254Test-1577433897-project-member] Acquiring lock "refresh_cache-e0e595e3-e47e-4cf1-8977-f004eca942d1" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1708.886383] env[68906]: DEBUG oslo_concurrency.lockutils [None req-6f158b16-c95b-47ba-98ef-cd5e461b3344 tempest-ServerShowV254Test-1577433897 tempest-ServerShowV254Test-1577433897-project-member] Acquired lock "refresh_cache-e0e595e3-e47e-4cf1-8977-f004eca942d1" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1708.886551] env[68906]: DEBUG nova.network.neutron [None req-6f158b16-c95b-47ba-98ef-cd5e461b3344 tempest-ServerShowV254Test-1577433897 tempest-ServerShowV254Test-1577433897-project-member] [instance: e0e595e3-e47e-4cf1-8977-f004eca942d1] Building network info cache for instance {{(pid=68906) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1708.896577] env[68906]: DEBUG nova.compute.manager [None req-70df84c4-75b7-4312-8ee4-e32fe1fafc2b tempest-DeleteServersTestJSON-1763795391 tempest-DeleteServersTestJSON-1763795391-project-member] [instance: 7faa4c32-7572-4594-a760-e928607bf2b6] Starting instance... {{(pid=68906) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1708.915300] env[68906]: DEBUG nova.network.neutron [None req-6f158b16-c95b-47ba-98ef-cd5e461b3344 tempest-ServerShowV254Test-1577433897 tempest-ServerShowV254Test-1577433897-project-member] [instance: e0e595e3-e47e-4cf1-8977-f004eca942d1] Instance cache missing network info. {{(pid=68906) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1708.926376] env[68906]: DEBUG nova.compute.manager [None req-70df84c4-75b7-4312-8ee4-e32fe1fafc2b tempest-DeleteServersTestJSON-1763795391 tempest-DeleteServersTestJSON-1763795391-project-member] [instance: 7faa4c32-7572-4594-a760-e928607bf2b6] Instance disappeared before build. 
{{(pid=68906) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1708.947480] env[68906]: DEBUG oslo_concurrency.lockutils [None req-70df84c4-75b7-4312-8ee4-e32fe1fafc2b tempest-DeleteServersTestJSON-1763795391 tempest-DeleteServersTestJSON-1763795391-project-member] Lock "7faa4c32-7572-4594-a760-e928607bf2b6" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 220.493s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1708.956162] env[68906]: DEBUG nova.compute.manager [None req-8706cf73-54ea-418d-91c3-3d08fba88ac5 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] [instance: ce6e5cd6-efb8-46d1-811d-74c084661cce] Starting instance... {{(pid=68906) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1709.015926] env[68906]: DEBUG oslo_concurrency.lockutils [None req-8706cf73-54ea-418d-91c3-3d08fba88ac5 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1709.016252] env[68906]: DEBUG oslo_concurrency.lockutils [None req-8706cf73-54ea-418d-91c3-3d08fba88ac5 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1709.017881] env[68906]: INFO nova.compute.claims [None req-8706cf73-54ea-418d-91c3-3d08fba88ac5 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] [instance: ce6e5cd6-efb8-46d1-811d-74c084661cce] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1709.165535] env[68906]: DEBUG nova.network.neutron [None req-6f158b16-c95b-47ba-98ef-cd5e461b3344 tempest-ServerShowV254Test-1577433897 tempest-ServerShowV254Test-1577433897-project-member] [instance: e0e595e3-e47e-4cf1-8977-f004eca942d1] Updating instance_info_cache with network_info: [] {{(pid=68906) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1709.174782] env[68906]: DEBUG oslo_concurrency.lockutils [None req-6f158b16-c95b-47ba-98ef-cd5e461b3344 tempest-ServerShowV254Test-1577433897 tempest-ServerShowV254Test-1577433897-project-member] Releasing lock "refresh_cache-e0e595e3-e47e-4cf1-8977-f004eca942d1" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1709.175192] env[68906]: DEBUG nova.compute.manager [None req-6f158b16-c95b-47ba-98ef-cd5e461b3344 tempest-ServerShowV254Test-1577433897 tempest-ServerShowV254Test-1577433897-project-member] [instance: e0e595e3-e47e-4cf1-8977-f004eca942d1] Start destroying the instance on the hypervisor.
{{(pid=68906) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1709.175482] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-6f158b16-c95b-47ba-98ef-cd5e461b3344 tempest-ServerShowV254Test-1577433897 tempest-ServerShowV254Test-1577433897-project-member] [instance: e0e595e3-e47e-4cf1-8977-f004eca942d1] Destroying instance {{(pid=68906) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1709.175928] env[68906]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-6960e070-fa26-4c85-897c-e43058fb452b {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1709.187879] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7eb086a4-2747-40b2-b6e3-17c8552c6c32 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1709.219288] env[68906]: WARNING nova.virt.vmwareapi.vmops [None req-6f158b16-c95b-47ba-98ef-cd5e461b3344 tempest-ServerShowV254Test-1577433897 tempest-ServerShowV254Test-1577433897-project-member] [instance: e0e595e3-e47e-4cf1-8977-f004eca942d1] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance e0e595e3-e47e-4cf1-8977-f004eca942d1 could not be found. [ 1709.219493] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-6f158b16-c95b-47ba-98ef-cd5e461b3344 tempest-ServerShowV254Test-1577433897 tempest-ServerShowV254Test-1577433897-project-member] [instance: e0e595e3-e47e-4cf1-8977-f004eca942d1] Instance destroyed {{(pid=68906) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1709.219673] env[68906]: INFO nova.compute.manager [None req-6f158b16-c95b-47ba-98ef-cd5e461b3344 tempest-ServerShowV254Test-1577433897 tempest-ServerShowV254Test-1577433897-project-member] [instance: e0e595e3-e47e-4cf1-8977-f004eca942d1] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1709.219914] env[68906]: DEBUG oslo.service.loopingcall [None req-6f158b16-c95b-47ba-98ef-cd5e461b3344 tempest-ServerShowV254Test-1577433897 tempest-ServerShowV254Test-1577433897-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return.
{{(pid=68906) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1709.222195] env[68906]: DEBUG nova.compute.manager [-] [instance: e0e595e3-e47e-4cf1-8977-f004eca942d1] Deallocating network for instance {{(pid=68906) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1709.222298] env[68906]: DEBUG nova.network.neutron [-] [instance: e0e595e3-e47e-4cf1-8977-f004eca942d1] deallocate_for_instance() {{(pid=68906) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1709.275715] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fe902024-cc2f-4825-bf40-8ba038b1ec2c {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1709.283659] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b5b21118-386f-4b28-ac24-56d8042674ed {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1709.313415] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a892556c-4eba-42ad-95be-38bfba11151a {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1709.320651] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-60a7a3e1-7225-4745-8674-037892233f46 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1709.333443] env[68906]: DEBUG nova.compute.provider_tree [None req-8706cf73-54ea-418d-91c3-3d08fba88ac5 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Inventory has not changed in ProviderTree for provider: 1119f6db-bfd7-4ef3-bdff-5c6974dc249b {{(pid=68906) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1709.342201] env[68906]: DEBUG nova.scheduler.client.report [None req-8706cf73-54ea-418d-91c3-3d08fba88ac5 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Inventory has not changed for provider 1119f6db-bfd7-4ef3-bdff-5c6974dc249b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 93, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68906) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1709.355112] env[68906]: DEBUG oslo_concurrency.lockutils [None req-8706cf73-54ea-418d-91c3-3d08fba88ac5 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.339s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1709.355500] env[68906]: DEBUG nova.compute.manager [None req-8706cf73-54ea-418d-91c3-3d08fba88ac5 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] [instance: ce6e5cd6-efb8-46d1-811d-74c084661cce] Start building networks asynchronously for instance. 
{{(pid=68906) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1709.359313] env[68906]: DEBUG neutronclient.v2_0.client [-] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=68906) _handle_fault_response /opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py:262}} [ 1709.359564] env[68906]: ERROR nova.network.neutron [-] Neutron client was not able to generate a valid admin token, please verify Neutron admin credential located in nova.conf: neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1709.360044] env[68906]: ERROR oslo.service.loopingcall [-] Dynamic interval looping call 'oslo_service.loopingcall.RetryDecorator.__call__.<locals>._func' failed: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. [ 1709.360044] env[68906]: ERROR oslo.service.loopingcall Traceback (most recent call last): [ 1709.360044] env[68906]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1709.360044] env[68906]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1709.360044] env[68906]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1709.360044] env[68906]: ERROR oslo.service.loopingcall exception_handler_v20(status_code, error_body) [ 1709.360044] env[68906]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1709.360044] env[68906]: ERROR oslo.service.loopingcall raise client_exc(message=error_message, [ 1709.360044] env[68906]: ERROR oslo.service.loopingcall neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1709.360044] env[68906]: ERROR oslo.service.loopingcall Neutron server returns request_ids: ['req-9d8d6def-d305-4ec1-84fe-e3c63733db54'] [ 1709.360044] env[68906]: ERROR oslo.service.loopingcall [ 1709.360044] env[68906]: ERROR oslo.service.loopingcall During handling of the above exception, another exception occurred: [ 1709.360044] env[68906]: ERROR oslo.service.loopingcall [ 1709.360044] env[68906]: ERROR oslo.service.loopingcall Traceback (most recent call last): [ 1709.360044] env[68906]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 150, in _run_loop [ 1709.360044] env[68906]: ERROR oslo.service.loopingcall result = func(*self.args, **self.kw) [ 1709.360539] env[68906]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 407, in _func [ 1709.360539] env[68906]: ERROR oslo.service.loopingcall result = f(*args, **kwargs) [ 1709.360539] env[68906]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/compute/manager.py", line 3045, in _deallocate_network_with_retries [ 1709.360539] env[68906]: ERROR oslo.service.loopingcall self._deallocate_network( [ 1709.360539] env[68906]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/compute/manager.py", line 2265, in _deallocate_network [ 1709.360539] env[68906]: ERROR oslo.service.loopingcall
self.network_api.deallocate_for_instance( [ 1709.360539] env[68906]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 1709.360539] env[68906]: ERROR oslo.service.loopingcall data = neutron.list_ports(**search_opts) [ 1709.360539] env[68906]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1709.360539] env[68906]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1709.360539] env[68906]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1709.360539] env[68906]: ERROR oslo.service.loopingcall return self.list('ports', self.ports_path, retrieve_all, [ 1709.360539] env[68906]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1709.360539] env[68906]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1709.360539] env[68906]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 372, in list [ 1709.360539] env[68906]: ERROR oslo.service.loopingcall for r in self._pagination(collection, path, **params): [ 1709.360539] env[68906]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1709.360539] env[68906]: ERROR oslo.service.loopingcall res = self.get(path, params=params) [ 1709.361070] env[68906]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1709.361070] env[68906]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1709.361070] env[68906]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 356, in get [ 1709.361070] env[68906]: ERROR oslo.service.loopingcall return self.retry_request("GET", action, body=body, [ 1709.361070] env[68906]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1709.361070] env[68906]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1709.361070] env[68906]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1709.361070] env[68906]: ERROR oslo.service.loopingcall return self.do_request(method, action, body=body, [ 1709.361070] env[68906]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1709.361070] env[68906]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1709.361070] env[68906]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1709.361070] env[68906]: ERROR oslo.service.loopingcall self._handle_fault_response(status_code, replybody, resp) [ 1709.361070] env[68906]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 212, in wrapper [ 1709.361070] env[68906]: ERROR oslo.service.loopingcall raise exception.NeutronAdminCredentialConfigurationInvalid() [ 1709.361070] env[68906]: ERROR oslo.service.loopingcall nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. 
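The 401 above means the service credentials nova-compute presents for its admin calls to Neutron failed Keystone authentication, exactly as the "please verify Neutron admin credential located in nova.conf" message says. For reference, the relevant section of nova.conf looks roughly like the following illustrative fragment; every value is a placeholder, and in a CI run like this the 401 can just as well be a transient Keystone hiccup or token expiry as a genuinely wrong password:

[neutron]
auth_type = password
auth_url = http://KEYSTONE_HOST/identity
username = neutron
password = SERVICE_PASSWORD
project_name = service
user_domain_name = Default
project_domain_name = Default
region_name = RegionOne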
[ 1709.361070] env[68906]: ERROR oslo.service.loopingcall [ 1709.361539] env[68906]: ERROR nova.compute.manager [None req-6f158b16-c95b-47ba-98ef-cd5e461b3344 tempest-ServerShowV254Test-1577433897 tempest-ServerShowV254Test-1577433897-project-member] [instance: e0e595e3-e47e-4cf1-8977-f004eca942d1] Failed to deallocate network for instance. Error: Networking client is experiencing an unauthorized exception.: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. [ 1709.389818] env[68906]: ERROR nova.compute.manager [None req-6f158b16-c95b-47ba-98ef-cd5e461b3344 tempest-ServerShowV254Test-1577433897 tempest-ServerShowV254Test-1577433897-project-member] [instance: e0e595e3-e47e-4cf1-8977-f004eca942d1] Setting instance vm_state to ERROR: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. [ 1709.389818] env[68906]: ERROR nova.compute.manager [instance: e0e595e3-e47e-4cf1-8977-f004eca942d1] Traceback (most recent call last): [ 1709.389818] env[68906]: ERROR nova.compute.manager [instance: e0e595e3-e47e-4cf1-8977-f004eca942d1] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1709.389818] env[68906]: ERROR nova.compute.manager [instance: e0e595e3-e47e-4cf1-8977-f004eca942d1] ret = obj(*args, **kwargs) [ 1709.389818] env[68906]: ERROR nova.compute.manager [instance: e0e595e3-e47e-4cf1-8977-f004eca942d1] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1709.389818] env[68906]: ERROR nova.compute.manager [instance: e0e595e3-e47e-4cf1-8977-f004eca942d1] exception_handler_v20(status_code, error_body) [ 1709.389818] env[68906]: ERROR nova.compute.manager [instance: e0e595e3-e47e-4cf1-8977-f004eca942d1] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1709.389818] env[68906]: ERROR nova.compute.manager [instance: e0e595e3-e47e-4cf1-8977-f004eca942d1] raise client_exc(message=error_message, [ 1709.389818] env[68906]: ERROR nova.compute.manager [instance: e0e595e3-e47e-4cf1-8977-f004eca942d1] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1709.389818] env[68906]: ERROR nova.compute.manager [instance: e0e595e3-e47e-4cf1-8977-f004eca942d1] Neutron server returns request_ids: ['req-9d8d6def-d305-4ec1-84fe-e3c63733db54'] [ 1709.389818] env[68906]: ERROR nova.compute.manager [instance: e0e595e3-e47e-4cf1-8977-f004eca942d1] [ 1709.390355] env[68906]: ERROR nova.compute.manager [instance: e0e595e3-e47e-4cf1-8977-f004eca942d1] During handling of the above exception, another exception occurred: [ 1709.390355] env[68906]: ERROR nova.compute.manager [instance: e0e595e3-e47e-4cf1-8977-f004eca942d1] [ 1709.390355] env[68906]: ERROR nova.compute.manager [instance: e0e595e3-e47e-4cf1-8977-f004eca942d1] Traceback (most recent call last): [ 1709.390355] env[68906]: ERROR nova.compute.manager [instance: e0e595e3-e47e-4cf1-8977-f004eca942d1] File "/opt/stack/nova/nova/compute/manager.py", line 3315, in do_terminate_instance [ 1709.390355] env[68906]: ERROR nova.compute.manager [instance: e0e595e3-e47e-4cf1-8977-f004eca942d1] self._delete_instance(context, instance, bdms) [ 1709.390355] env[68906]: ERROR nova.compute.manager [instance: e0e595e3-e47e-4cf1-8977-f004eca942d1] File 
"/opt/stack/nova/nova/compute/manager.py", line 3250, in _delete_instance [ 1709.390355] env[68906]: ERROR nova.compute.manager [instance: e0e595e3-e47e-4cf1-8977-f004eca942d1] self._shutdown_instance(context, instance, bdms) [ 1709.390355] env[68906]: ERROR nova.compute.manager [instance: e0e595e3-e47e-4cf1-8977-f004eca942d1] File "/opt/stack/nova/nova/compute/manager.py", line 3144, in _shutdown_instance [ 1709.390355] env[68906]: ERROR nova.compute.manager [instance: e0e595e3-e47e-4cf1-8977-f004eca942d1] self._try_deallocate_network(context, instance, requested_networks) [ 1709.390355] env[68906]: ERROR nova.compute.manager [instance: e0e595e3-e47e-4cf1-8977-f004eca942d1] File "/opt/stack/nova/nova/compute/manager.py", line 3058, in _try_deallocate_network [ 1709.390355] env[68906]: ERROR nova.compute.manager [instance: e0e595e3-e47e-4cf1-8977-f004eca942d1] with excutils.save_and_reraise_exception(): [ 1709.390355] env[68906]: ERROR nova.compute.manager [instance: e0e595e3-e47e-4cf1-8977-f004eca942d1] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1709.390355] env[68906]: ERROR nova.compute.manager [instance: e0e595e3-e47e-4cf1-8977-f004eca942d1] self.force_reraise() [ 1709.390802] env[68906]: ERROR nova.compute.manager [instance: e0e595e3-e47e-4cf1-8977-f004eca942d1] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1709.390802] env[68906]: ERROR nova.compute.manager [instance: e0e595e3-e47e-4cf1-8977-f004eca942d1] raise self.value [ 1709.390802] env[68906]: ERROR nova.compute.manager [instance: e0e595e3-e47e-4cf1-8977-f004eca942d1] File "/opt/stack/nova/nova/compute/manager.py", line 3056, in _try_deallocate_network [ 1709.390802] env[68906]: ERROR nova.compute.manager [instance: e0e595e3-e47e-4cf1-8977-f004eca942d1] _deallocate_network_with_retries() [ 1709.390802] env[68906]: ERROR nova.compute.manager [instance: e0e595e3-e47e-4cf1-8977-f004eca942d1] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 436, in func [ 1709.390802] env[68906]: ERROR nova.compute.manager [instance: e0e595e3-e47e-4cf1-8977-f004eca942d1] return evt.wait() [ 1709.390802] env[68906]: ERROR nova.compute.manager [instance: e0e595e3-e47e-4cf1-8977-f004eca942d1] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1709.390802] env[68906]: ERROR nova.compute.manager [instance: e0e595e3-e47e-4cf1-8977-f004eca942d1] result = hub.switch() [ 1709.390802] env[68906]: ERROR nova.compute.manager [instance: e0e595e3-e47e-4cf1-8977-f004eca942d1] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1709.390802] env[68906]: ERROR nova.compute.manager [instance: e0e595e3-e47e-4cf1-8977-f004eca942d1] return self.greenlet.switch() [ 1709.390802] env[68906]: ERROR nova.compute.manager [instance: e0e595e3-e47e-4cf1-8977-f004eca942d1] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 150, in _run_loop [ 1709.390802] env[68906]: ERROR nova.compute.manager [instance: e0e595e3-e47e-4cf1-8977-f004eca942d1] result = func(*self.args, **self.kw) [ 1709.391188] env[68906]: ERROR nova.compute.manager [instance: e0e595e3-e47e-4cf1-8977-f004eca942d1] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 407, in _func [ 1709.391188] env[68906]: ERROR nova.compute.manager [instance: e0e595e3-e47e-4cf1-8977-f004eca942d1] 
result = f(*args, **kwargs) [ 1709.391188] env[68906]: ERROR nova.compute.manager [instance: e0e595e3-e47e-4cf1-8977-f004eca942d1] File "/opt/stack/nova/nova/compute/manager.py", line 3045, in _deallocate_network_with_retries [ 1709.391188] env[68906]: ERROR nova.compute.manager [instance: e0e595e3-e47e-4cf1-8977-f004eca942d1] self._deallocate_network( [ 1709.391188] env[68906]: ERROR nova.compute.manager [instance: e0e595e3-e47e-4cf1-8977-f004eca942d1] File "/opt/stack/nova/nova/compute/manager.py", line 2265, in _deallocate_network [ 1709.391188] env[68906]: ERROR nova.compute.manager [instance: e0e595e3-e47e-4cf1-8977-f004eca942d1] self.network_api.deallocate_for_instance( [ 1709.391188] env[68906]: ERROR nova.compute.manager [instance: e0e595e3-e47e-4cf1-8977-f004eca942d1] File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 1709.391188] env[68906]: ERROR nova.compute.manager [instance: e0e595e3-e47e-4cf1-8977-f004eca942d1] data = neutron.list_ports(**search_opts) [ 1709.391188] env[68906]: ERROR nova.compute.manager [instance: e0e595e3-e47e-4cf1-8977-f004eca942d1] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1709.391188] env[68906]: ERROR nova.compute.manager [instance: e0e595e3-e47e-4cf1-8977-f004eca942d1] ret = obj(*args, **kwargs) [ 1709.391188] env[68906]: ERROR nova.compute.manager [instance: e0e595e3-e47e-4cf1-8977-f004eca942d1] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1709.391188] env[68906]: ERROR nova.compute.manager [instance: e0e595e3-e47e-4cf1-8977-f004eca942d1] return self.list('ports', self.ports_path, retrieve_all, [ 1709.391188] env[68906]: ERROR nova.compute.manager [instance: e0e595e3-e47e-4cf1-8977-f004eca942d1] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1709.391588] env[68906]: ERROR nova.compute.manager [instance: e0e595e3-e47e-4cf1-8977-f004eca942d1] ret = obj(*args, **kwargs) [ 1709.391588] env[68906]: ERROR nova.compute.manager [instance: e0e595e3-e47e-4cf1-8977-f004eca942d1] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 372, in list [ 1709.391588] env[68906]: ERROR nova.compute.manager [instance: e0e595e3-e47e-4cf1-8977-f004eca942d1] for r in self._pagination(collection, path, **params): [ 1709.391588] env[68906]: ERROR nova.compute.manager [instance: e0e595e3-e47e-4cf1-8977-f004eca942d1] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1709.391588] env[68906]: ERROR nova.compute.manager [instance: e0e595e3-e47e-4cf1-8977-f004eca942d1] res = self.get(path, params=params) [ 1709.391588] env[68906]: ERROR nova.compute.manager [instance: e0e595e3-e47e-4cf1-8977-f004eca942d1] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1709.391588] env[68906]: ERROR nova.compute.manager [instance: e0e595e3-e47e-4cf1-8977-f004eca942d1] ret = obj(*args, **kwargs) [ 1709.391588] env[68906]: ERROR nova.compute.manager [instance: e0e595e3-e47e-4cf1-8977-f004eca942d1] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 356, in get [ 1709.391588] env[68906]: ERROR nova.compute.manager [instance: e0e595e3-e47e-4cf1-8977-f004eca942d1] return self.retry_request("GET", action, body=body, [ 1709.391588] env[68906]: ERROR nova.compute.manager [instance: e0e595e3-e47e-4cf1-8977-f004eca942d1] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 
1709.391588] env[68906]: ERROR nova.compute.manager [instance: e0e595e3-e47e-4cf1-8977-f004eca942d1] ret = obj(*args, **kwargs) [ 1709.391588] env[68906]: ERROR nova.compute.manager [instance: e0e595e3-e47e-4cf1-8977-f004eca942d1] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1709.391588] env[68906]: ERROR nova.compute.manager [instance: e0e595e3-e47e-4cf1-8977-f004eca942d1] return self.do_request(method, action, body=body, [ 1709.391996] env[68906]: ERROR nova.compute.manager [instance: e0e595e3-e47e-4cf1-8977-f004eca942d1] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1709.391996] env[68906]: ERROR nova.compute.manager [instance: e0e595e3-e47e-4cf1-8977-f004eca942d1] ret = obj(*args, **kwargs) [ 1709.391996] env[68906]: ERROR nova.compute.manager [instance: e0e595e3-e47e-4cf1-8977-f004eca942d1] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1709.391996] env[68906]: ERROR nova.compute.manager [instance: e0e595e3-e47e-4cf1-8977-f004eca942d1] self._handle_fault_response(status_code, replybody, resp) [ 1709.391996] env[68906]: ERROR nova.compute.manager [instance: e0e595e3-e47e-4cf1-8977-f004eca942d1] File "/opt/stack/nova/nova/network/neutron.py", line 212, in wrapper [ 1709.391996] env[68906]: ERROR nova.compute.manager [instance: e0e595e3-e47e-4cf1-8977-f004eca942d1] raise exception.NeutronAdminCredentialConfigurationInvalid() [ 1709.391996] env[68906]: ERROR nova.compute.manager [instance: e0e595e3-e47e-4cf1-8977-f004eca942d1] nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. [ 1709.391996] env[68906]: ERROR nova.compute.manager [instance: e0e595e3-e47e-4cf1-8977-f004eca942d1] [ 1709.393648] env[68906]: DEBUG nova.compute.utils [None req-8706cf73-54ea-418d-91c3-3d08fba88ac5 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Using /dev/sd instead of None {{(pid=68906) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1709.394983] env[68906]: DEBUG nova.compute.manager [None req-8706cf73-54ea-418d-91c3-3d08fba88ac5 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] [instance: ce6e5cd6-efb8-46d1-811d-74c084661cce] Allocating IP information in the background. {{(pid=68906) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1709.395206] env[68906]: DEBUG nova.network.neutron [None req-8706cf73-54ea-418d-91c3-3d08fba88ac5 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] [instance: ce6e5cd6-efb8-46d1-811d-74c084661cce] allocate_for_instance() {{(pid=68906) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1709.403428] env[68906]: DEBUG nova.compute.manager [None req-8706cf73-54ea-418d-91c3-3d08fba88ac5 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] [instance: ce6e5cd6-efb8-46d1-811d-74c084661cce] Start building block device mappings for instance. 
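Both tracebacks funnel through the same proxy in nova/network/neutron.py (the repeated wrapper frames at line 196 and the raise at line 212): every neutronclient call is intercepted, and an Unauthorized answer from Neutron is converted into the Nova-level NeutronAdminCredentialConfigurationInvalid when the client was built with admin credentials. A simplified sketch of that pattern, not the verbatim Nova code:

from neutronclient.common import exceptions as neutron_client_exc

class NeutronAdminCredentialConfigurationInvalid(Exception):
    # Stand-in for nova.exception.NeutronAdminCredentialConfigurationInvalid.
    pass

def proxy_call(func, is_admin_client, *args, **kwargs):
    # Call into neutronclient; re-raise an auth failure on the admin
    # client as a configuration error, mirroring neutron.py:196/212.
    try:
        return func(*args, **kwargs)
    except neutron_client_exc.Unauthorized:
        if is_admin_client:
            raise NeutronAdminCredentialConfigurationInvalid()
        raise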
{{(pid=68906) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1709.433193] env[68906]: DEBUG oslo_concurrency.lockutils [None req-6f158b16-c95b-47ba-98ef-cd5e461b3344 tempest-ServerShowV254Test-1577433897 tempest-ServerShowV254Test-1577433897-project-member] Lock "e0e595e3-e47e-4cf1-8977-f004eca942d1" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.551s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1709.459915] env[68906]: DEBUG nova.policy [None req-8706cf73-54ea-418d-91c3-3d08fba88ac5 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'b46c06fcd3404f45abc083563415467b', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'da1df204e7064662bf5c15a1598c0d4e', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68906) authorize /opt/stack/nova/nova/policy.py:203}} [ 1709.485050] env[68906]: DEBUG nova.compute.manager [None req-8706cf73-54ea-418d-91c3-3d08fba88ac5 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] [instance: ce6e5cd6-efb8-46d1-811d-74c084661cce] Start spawning the instance on the hypervisor. {{(pid=68906) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1709.488437] env[68906]: INFO nova.compute.manager [None req-6f158b16-c95b-47ba-98ef-cd5e461b3344 tempest-ServerShowV254Test-1577433897 tempest-ServerShowV254Test-1577433897-project-member] [instance: e0e595e3-e47e-4cf1-8977-f004eca942d1] Successfully reverted task state from None on failure for instance. [ 1709.491681] env[68906]: ERROR oslo_messaging.rpc.server [None req-6f158b16-c95b-47ba-98ef-cd5e461b3344 tempest-ServerShowV254Test-1577433897 tempest-ServerShowV254Test-1577433897-project-member] Exception during message handling: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception.
[ 1709.491681] env[68906]: ERROR oslo_messaging.rpc.server Traceback (most recent call last): [ 1709.491681] env[68906]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1709.491681] env[68906]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1709.491681] env[68906]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1709.491681] env[68906]: ERROR oslo_messaging.rpc.server exception_handler_v20(status_code, error_body) [ 1709.491681] env[68906]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1709.491681] env[68906]: ERROR oslo_messaging.rpc.server raise client_exc(message=error_message, [ 1709.491681] env[68906]: ERROR oslo_messaging.rpc.server neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1709.491681] env[68906]: ERROR oslo_messaging.rpc.server Neutron server returns request_ids: ['req-9d8d6def-d305-4ec1-84fe-e3c63733db54'] [ 1709.491681] env[68906]: ERROR oslo_messaging.rpc.server [ 1709.491681] env[68906]: ERROR oslo_messaging.rpc.server During handling of the above exception, another exception occurred: [ 1709.491681] env[68906]: ERROR oslo_messaging.rpc.server [ 1709.491681] env[68906]: ERROR oslo_messaging.rpc.server Traceback (most recent call last): [ 1709.491681] env[68906]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_messaging/rpc/server.py", line 165, in _process_incoming [ 1709.491681] env[68906]: ERROR oslo_messaging.rpc.server res = self.dispatcher.dispatch(message) [ 1709.492252] env[68906]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_messaging/rpc/dispatcher.py", line 309, in dispatch [ 1709.492252] env[68906]: ERROR oslo_messaging.rpc.server return self._do_dispatch(endpoint, method, ctxt, args) [ 1709.492252] env[68906]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_messaging/rpc/dispatcher.py", line 229, in _do_dispatch [ 1709.492252] env[68906]: ERROR oslo_messaging.rpc.server result = func(ctxt, **new_args) [ 1709.492252] env[68906]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/exception_wrapper.py", line 65, in wrapped [ 1709.492252] env[68906]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception(): [ 1709.492252] env[68906]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1709.492252] env[68906]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 1709.492252] env[68906]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1709.492252] env[68906]: ERROR oslo_messaging.rpc.server raise self.value [ 1709.492252] env[68906]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/exception_wrapper.py", line 63, in wrapped [ 1709.492252] env[68906]: ERROR oslo_messaging.rpc.server return f(self, context, *args, **kw) [ 1709.492252] env[68906]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 166, in decorated_function [ 1709.492252] env[68906]: ERROR oslo_messaging.rpc.server with 
excutils.save_and_reraise_exception(): [ 1709.492252] env[68906]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1709.492252] env[68906]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 1709.492252] env[68906]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1709.492252] env[68906]: ERROR oslo_messaging.rpc.server raise self.value [ 1709.492768] env[68906]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 157, in decorated_function [ 1709.492768] env[68906]: ERROR oslo_messaging.rpc.server return function(self, context, *args, **kwargs) [ 1709.492768] env[68906]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/utils.py", line 1439, in decorated_function [ 1709.492768] env[68906]: ERROR oslo_messaging.rpc.server return function(self, context, *args, **kwargs) [ 1709.492768] env[68906]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 213, in decorated_function [ 1709.492768] env[68906]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception(): [ 1709.492768] env[68906]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1709.492768] env[68906]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 1709.492768] env[68906]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1709.492768] env[68906]: ERROR oslo_messaging.rpc.server raise self.value [ 1709.492768] env[68906]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 203, in decorated_function [ 1709.492768] env[68906]: ERROR oslo_messaging.rpc.server return function(self, context, *args, **kwargs) [ 1709.492768] env[68906]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3327, in terminate_instance [ 1709.492768] env[68906]: ERROR oslo_messaging.rpc.server do_terminate_instance(instance, bdms) [ 1709.492768] env[68906]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py", line 414, in inner [ 1709.492768] env[68906]: ERROR oslo_messaging.rpc.server return f(*args, **kwargs) [ 1709.492768] env[68906]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3322, in do_terminate_instance [ 1709.492768] env[68906]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception(): [ 1709.493284] env[68906]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1709.493284] env[68906]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 1709.493284] env[68906]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1709.493284] env[68906]: ERROR oslo_messaging.rpc.server raise self.value [ 1709.493284] env[68906]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3315, in do_terminate_instance [ 1709.493284] env[68906]: ERROR oslo_messaging.rpc.server self._delete_instance(context, instance, bdms) [ 1709.493284] env[68906]: ERROR oslo_messaging.rpc.server File 
"/opt/stack/nova/nova/compute/manager.py", line 3250, in _delete_instance [ 1709.493284] env[68906]: ERROR oslo_messaging.rpc.server self._shutdown_instance(context, instance, bdms) [ 1709.493284] env[68906]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3144, in _shutdown_instance [ 1709.493284] env[68906]: ERROR oslo_messaging.rpc.server self._try_deallocate_network(context, instance, requested_networks) [ 1709.493284] env[68906]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3058, in _try_deallocate_network [ 1709.493284] env[68906]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception(): [ 1709.493284] env[68906]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1709.493284] env[68906]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 1709.493284] env[68906]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1709.493284] env[68906]: ERROR oslo_messaging.rpc.server raise self.value [ 1709.493284] env[68906]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3056, in _try_deallocate_network [ 1709.493284] env[68906]: ERROR oslo_messaging.rpc.server _deallocate_network_with_retries() [ 1709.493826] env[68906]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 436, in func [ 1709.493826] env[68906]: ERROR oslo_messaging.rpc.server return evt.wait() [ 1709.493826] env[68906]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1709.493826] env[68906]: ERROR oslo_messaging.rpc.server result = hub.switch() [ 1709.493826] env[68906]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1709.493826] env[68906]: ERROR oslo_messaging.rpc.server return self.greenlet.switch() [ 1709.493826] env[68906]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 150, in _run_loop [ 1709.493826] env[68906]: ERROR oslo_messaging.rpc.server result = func(*self.args, **self.kw) [ 1709.493826] env[68906]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 407, in _func [ 1709.493826] env[68906]: ERROR oslo_messaging.rpc.server result = f(*args, **kwargs) [ 1709.493826] env[68906]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3045, in _deallocate_network_with_retries [ 1709.493826] env[68906]: ERROR oslo_messaging.rpc.server self._deallocate_network( [ 1709.493826] env[68906]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 2265, in _deallocate_network [ 1709.493826] env[68906]: ERROR oslo_messaging.rpc.server self.network_api.deallocate_for_instance( [ 1709.493826] env[68906]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 1709.493826] env[68906]: ERROR oslo_messaging.rpc.server data = neutron.list_ports(**search_opts) [ 1709.493826] env[68906]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1709.493826] env[68906]: ERROR 
oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1709.494376] env[68906]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1709.494376] env[68906]: ERROR oslo_messaging.rpc.server return self.list('ports', self.ports_path, retrieve_all, [ 1709.494376] env[68906]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1709.494376] env[68906]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1709.494376] env[68906]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 372, in list [ 1709.494376] env[68906]: ERROR oslo_messaging.rpc.server for r in self._pagination(collection, path, **params): [ 1709.494376] env[68906]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1709.494376] env[68906]: ERROR oslo_messaging.rpc.server res = self.get(path, params=params) [ 1709.494376] env[68906]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1709.494376] env[68906]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1709.494376] env[68906]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 356, in get [ 1709.494376] env[68906]: ERROR oslo_messaging.rpc.server return self.retry_request("GET", action, body=body, [ 1709.494376] env[68906]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1709.494376] env[68906]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1709.494376] env[68906]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1709.494376] env[68906]: ERROR oslo_messaging.rpc.server return self.do_request(method, action, body=body, [ 1709.494376] env[68906]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1709.494376] env[68906]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1709.495042] env[68906]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1709.495042] env[68906]: ERROR oslo_messaging.rpc.server self._handle_fault_response(status_code, replybody, resp) [ 1709.495042] env[68906]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 212, in wrapper [ 1709.495042] env[68906]: ERROR oslo_messaging.rpc.server raise exception.NeutronAdminCredentialConfigurationInvalid() [ 1709.495042] env[68906]: ERROR oslo_messaging.rpc.server nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. 
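The "Dynamic interval looping call ... RetryDecorator.__call__.<locals>._func failed" record earlier shows how _deallocate_network_with_retries is driven: oslo.service's RetryDecorator re-runs the wrapped callable with a growing sleep, but only for the exception types it was given, so the credential error escapes on the first attempt instead of being retried. An illustrative use of the decorator (the parameter values and exception type are made up for the example, not Nova's):

from oslo_service import loopingcall

class TransientNetworkError(Exception):
    pass

# Calling the decorated function starts a DynamicLoopingCall under the
# hood and blocks until the callable succeeds, exhausts its retries, or
# raises an exception type that is not listed in `exceptions`.
@loopingcall.RetryDecorator(max_retry_count=3, inc_sleep_time=2,
                            max_sleep_time=30,
                            exceptions=(TransientNetworkError,))
def deallocate_with_retries():
    pass  # placeholder body; Nova's version calls _deallocate_network here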
[ 1709.495042] env[68906]: ERROR oslo_messaging.rpc.server [ 1709.509641] env[68906]: DEBUG nova.virt.hardware [None req-8706cf73-54ea-418d-91c3-3d08fba88ac5 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-17T13:00:39Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-17T13:00:23Z,direct_url=,disk_format='vmdk',id=b1400c31-d33b-4e13-944f-4c645e62493e,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='1ae7bf3a375d41c6af5e7536af51ffd1',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-17T13:00:24Z,virtual_size=,visibility=), allow threads: False {{(pid=68906) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1709.509945] env[68906]: DEBUG nova.virt.hardware [None req-8706cf73-54ea-418d-91c3-3d08fba88ac5 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Flavor limits 0:0:0 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1709.510214] env[68906]: DEBUG nova.virt.hardware [None req-8706cf73-54ea-418d-91c3-3d08fba88ac5 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Image limits 0:0:0 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1709.510506] env[68906]: DEBUG nova.virt.hardware [None req-8706cf73-54ea-418d-91c3-3d08fba88ac5 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Flavor pref 0:0:0 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1709.510747] env[68906]: DEBUG nova.virt.hardware [None req-8706cf73-54ea-418d-91c3-3d08fba88ac5 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Image pref 0:0:0 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1709.510990] env[68906]: DEBUG nova.virt.hardware [None req-8706cf73-54ea-418d-91c3-3d08fba88ac5 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1709.511250] env[68906]: DEBUG nova.virt.hardware [None req-8706cf73-54ea-418d-91c3-3d08fba88ac5 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68906) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1709.511417] env[68906]: DEBUG nova.virt.hardware [None req-8706cf73-54ea-418d-91c3-3d08fba88ac5 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68906) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1709.511583] env[68906]: DEBUG nova.virt.hardware [None 
req-8706cf73-54ea-418d-91c3-3d08fba88ac5 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Got 1 possible topologies {{(pid=68906) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1709.511743] env[68906]: DEBUG nova.virt.hardware [None req-8706cf73-54ea-418d-91c3-3d08fba88ac5 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68906) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1709.511911] env[68906]: DEBUG nova.virt.hardware [None req-8706cf73-54ea-418d-91c3-3d08fba88ac5 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68906) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1709.512742] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-71a03520-9d83-4a9b-88e0-48ff5cd036c9 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1709.520721] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-88e2c47f-f9ca-4fbf-b92a-1f85ef0add90 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1709.828114] env[68906]: DEBUG nova.network.neutron [None req-8706cf73-54ea-418d-91c3-3d08fba88ac5 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] [instance: ce6e5cd6-efb8-46d1-811d-74c084661cce] Successfully created port: 0d935ad7-db44-4bc0-98a0-a5253ee6f5c1 {{(pid=68906) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1710.358998] env[68906]: DEBUG nova.compute.manager [req-cafc3850-977e-4f11-9c10-79d6f8761c21 req-dcd70671-f608-4d3e-9682-f3ddbaa6b099 service nova] [instance: ce6e5cd6-efb8-46d1-811d-74c084661cce] Received event network-vif-plugged-0d935ad7-db44-4bc0-98a0-a5253ee6f5c1 {{(pid=68906) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1710.359278] env[68906]: DEBUG oslo_concurrency.lockutils [req-cafc3850-977e-4f11-9c10-79d6f8761c21 req-dcd70671-f608-4d3e-9682-f3ddbaa6b099 service nova] Acquiring lock "ce6e5cd6-efb8-46d1-811d-74c084661cce-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1710.359437] env[68906]: DEBUG oslo_concurrency.lockutils [req-cafc3850-977e-4f11-9c10-79d6f8761c21 req-dcd70671-f608-4d3e-9682-f3ddbaa6b099 service nova] Lock "ce6e5cd6-efb8-46d1-811d-74c084661cce-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1710.359598] env[68906]: DEBUG oslo_concurrency.lockutils [req-cafc3850-977e-4f11-9c10-79d6f8761c21 req-dcd70671-f608-4d3e-9682-f3ddbaa6b099 service nova] Lock "ce6e5cd6-efb8-46d1-811d-74c084661cce-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1710.359763] env[68906]: DEBUG nova.compute.manager
[req-cafc3850-977e-4f11-9c10-79d6f8761c21 req-dcd70671-f608-4d3e-9682-f3ddbaa6b099 service nova] [instance: ce6e5cd6-efb8-46d1-811d-74c084661cce] No waiting events found dispatching network-vif-plugged-0d935ad7-db44-4bc0-98a0-a5253ee6f5c1 {{(pid=68906) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1710.359921] env[68906]: WARNING nova.compute.manager [req-cafc3850-977e-4f11-9c10-79d6f8761c21 req-dcd70671-f608-4d3e-9682-f3ddbaa6b099 service nova] [instance: ce6e5cd6-efb8-46d1-811d-74c084661cce] Received unexpected event network-vif-plugged-0d935ad7-db44-4bc0-98a0-a5253ee6f5c1 for instance with vm_state building and task_state spawning. [ 1710.418194] env[68906]: DEBUG nova.network.neutron [None req-8706cf73-54ea-418d-91c3-3d08fba88ac5 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] [instance: ce6e5cd6-efb8-46d1-811d-74c084661cce] Successfully updated port: 0d935ad7-db44-4bc0-98a0-a5253ee6f5c1 {{(pid=68906) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1710.433424] env[68906]: DEBUG oslo_concurrency.lockutils [None req-8706cf73-54ea-418d-91c3-3d08fba88ac5 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Acquiring lock "refresh_cache-ce6e5cd6-efb8-46d1-811d-74c084661cce" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1710.433584] env[68906]: DEBUG oslo_concurrency.lockutils [None req-8706cf73-54ea-418d-91c3-3d08fba88ac5 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Acquired lock "refresh_cache-ce6e5cd6-efb8-46d1-811d-74c084661cce" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1710.433737] env[68906]: DEBUG nova.network.neutron [None req-8706cf73-54ea-418d-91c3-3d08fba88ac5 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] [instance: ce6e5cd6-efb8-46d1-811d-74c084661cce] Building network info cache for instance {{(pid=68906) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1710.473744] env[68906]: DEBUG nova.network.neutron [None req-8706cf73-54ea-418d-91c3-3d08fba88ac5 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] [instance: ce6e5cd6-efb8-46d1-811d-74c084661cce] Instance cache missing network info. 
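The "No waiting events found" / "Received unexpected event" pair above is Nova's external-event handshake: Neutron reports network-vif-plugged, and compute pops a waiter that a spawning thread may have registered beforehand; if none was registered (the instance was still building, as here), the event is logged as unexpected and dropped. A stripped-down sketch of that prepare/pop idea, with hypothetical names and threading.Event standing in for Nova's eventlet plumbing:

import threading
from collections import defaultdict

class EventRegistry:
    def __init__(self):
        # instance uuid -> {event name: Event the spawner waits on}
        self._waiters = defaultdict(dict)

    def prepare(self, instance_uuid, name):
        ev = threading.Event()
        self._waiters[instance_uuid][name] = ev
        return ev  # the spawning thread later calls ev.wait()

    def pop(self, instance_uuid, name):
        ev = self._waiters[instance_uuid].pop(name, None)
        if ev is None:
            # The "Received unexpected event ..." case in the log above.
            print('unexpected event %s for %s' % (name, instance_uuid))
        else:
            ev.set()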
{{(pid=68906) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1710.632330] env[68906]: DEBUG nova.network.neutron [None req-8706cf73-54ea-418d-91c3-3d08fba88ac5 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] [instance: ce6e5cd6-efb8-46d1-811d-74c084661cce] Updating instance_info_cache with network_info: [{"id": "0d935ad7-db44-4bc0-98a0-a5253ee6f5c1", "address": "fa:16:3e:34:ca:3a", "network": {"id": "cbbbf860-58e5-4164-8de8-b1492ffc7605", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-1183077938-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "da1df204e7064662bf5c15a1598c0d4e", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "6eb7e3e9-5cc2-40f1-a6eb-f70f06531667", "external-id": "nsx-vlan-transportzone-938", "segmentation_id": 938, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap0d935ad7-db", "ovs_interfaceid": "0d935ad7-db44-4bc0-98a0-a5253ee6f5c1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68906) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1710.642525] env[68906]: DEBUG oslo_concurrency.lockutils [None req-8706cf73-54ea-418d-91c3-3d08fba88ac5 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Releasing lock "refresh_cache-ce6e5cd6-efb8-46d1-811d-74c084661cce" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1710.642796] env[68906]: DEBUG nova.compute.manager [None req-8706cf73-54ea-418d-91c3-3d08fba88ac5 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] [instance: ce6e5cd6-efb8-46d1-811d-74c084661cce] Instance network_info: |[{"id": "0d935ad7-db44-4bc0-98a0-a5253ee6f5c1", "address": "fa:16:3e:34:ca:3a", "network": {"id": "cbbbf860-58e5-4164-8de8-b1492ffc7605", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-1183077938-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "da1df204e7064662bf5c15a1598c0d4e", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "6eb7e3e9-5cc2-40f1-a6eb-f70f06531667", "external-id": "nsx-vlan-transportzone-938", "segmentation_id": 938, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap0d935ad7-db", "ovs_interfaceid": "0d935ad7-db44-4bc0-98a0-a5253ee6f5c1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=68906) _allocate_network_async 
/opt/stack/nova/nova/compute/manager.py:1971}} [ 1710.643252] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-8706cf73-54ea-418d-91c3-3d08fba88ac5 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] [instance: ce6e5cd6-efb8-46d1-811d-74c084661cce] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:34:ca:3a', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '6eb7e3e9-5cc2-40f1-a6eb-f70f06531667', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '0d935ad7-db44-4bc0-98a0-a5253ee6f5c1', 'vif_model': 'vmxnet3'}] {{(pid=68906) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1710.650755] env[68906]: DEBUG oslo.service.loopingcall [None req-8706cf73-54ea-418d-91c3-3d08fba88ac5 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=68906) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1710.651183] env[68906]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: ce6e5cd6-efb8-46d1-811d-74c084661cce] Creating VM on the ESX host {{(pid=68906) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1710.651406] env[68906]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-f24db33c-643f-4b0b-a944-d409bbbfe819 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1710.671390] env[68906]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1710.671390] env[68906]: value = "task-3475425" [ 1710.671390] env[68906]: _type = "Task" [ 1710.671390] env[68906]: } to complete. {{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1710.678907] env[68906]: DEBUG oslo_vmware.api [-] Task: {'id': task-3475425, 'name': CreateVM_Task} progress is 0%. {{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1711.181454] env[68906]: DEBUG oslo_vmware.api [-] Task: {'id': task-3475425, 'name': CreateVM_Task, 'duration_secs': 0.305094} completed successfully. 
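CreateVM_Task runs asynchronously on the vCenter side; the "progress is 0%" and "completed successfully ... duration_secs" records around here are oslo.vmware's task poller reporting on it. A generic poll loop with the same contract, where get_task_info is a hypothetical callable returning an object with .state in {'queued', 'running', 'success', 'error'} (this sketches the shape of the behaviour, not oslo.vmware's actual implementation):

import time

def wait_for_task(get_task_info, poll_interval=0.5, timeout=300):
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        info = get_task_info()
        if info.state == 'success':
            return info
        if info.state == 'error':
            raise RuntimeError('task failed: %s' % (info,))
        # still queued/running; sleep and poll again
        time.sleep(poll_interval)
    raise TimeoutError('task did not complete within %ss' % timeout)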
{{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1711.181642] env[68906]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: ce6e5cd6-efb8-46d1-811d-74c084661cce] Created VM on the ESX host {{(pid=68906) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1711.182397] env[68906]: DEBUG oslo_concurrency.lockutils [None req-8706cf73-54ea-418d-91c3-3d08fba88ac5 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1711.182567] env[68906]: DEBUG oslo_concurrency.lockutils [None req-8706cf73-54ea-418d-91c3-3d08fba88ac5 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Acquired lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1711.182896] env[68906]: DEBUG oslo_concurrency.lockutils [None req-8706cf73-54ea-418d-91c3-3d08fba88ac5 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1711.183170] env[68906]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-1ad5e16a-e8ad-4c0e-bb86-619f39d101d2 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1711.187496] env[68906]: DEBUG oslo_vmware.api [None req-8706cf73-54ea-418d-91c3-3d08fba88ac5 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Waiting for the task: (returnval){ [ 1711.187496] env[68906]: value = "session[52a3cb0d-1212-490c-2669-91043b4da4d8]52571ae0-2358-afd3-ebfc-6fadfead2c78" [ 1711.187496] env[68906]: _type = "Task" [ 1711.187496] env[68906]: } to complete. {{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1711.198323] env[68906]: DEBUG oslo_vmware.api [None req-8706cf73-54ea-418d-91c3-3d08fba88ac5 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Task: {'id': session[52a3cb0d-1212-490c-2669-91043b4da4d8]52571ae0-2358-afd3-ebfc-6fadfead2c78, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1711.698256] env[68906]: DEBUG oslo_concurrency.lockutils [None req-8706cf73-54ea-418d-91c3-3d08fba88ac5 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Releasing lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1711.698731] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-8706cf73-54ea-418d-91c3-3d08fba88ac5 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] [instance: ce6e5cd6-efb8-46d1-811d-74c084661cce] Processing image b1400c31-d33b-4e13-944f-4c645e62493e {{(pid=68906) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1711.698731] env[68906]: DEBUG oslo_concurrency.lockutils [None req-8706cf73-54ea-418d-91c3-3d08fba88ac5 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e/b1400c31-d33b-4e13-944f-4c645e62493e.vmdk" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1712.385054] env[68906]: DEBUG nova.compute.manager [req-97325af0-0efa-430c-acfe-d50c02609cd6 req-6c1b9e11-2b48-41c4-bb15-6f8da3fcd829 service nova] [instance: ce6e5cd6-efb8-46d1-811d-74c084661cce] Received event network-changed-0d935ad7-db44-4bc0-98a0-a5253ee6f5c1 {{(pid=68906) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1712.385287] env[68906]: DEBUG nova.compute.manager [req-97325af0-0efa-430c-acfe-d50c02609cd6 req-6c1b9e11-2b48-41c4-bb15-6f8da3fcd829 service nova] [instance: ce6e5cd6-efb8-46d1-811d-74c084661cce] Refreshing instance network info cache due to event network-changed-0d935ad7-db44-4bc0-98a0-a5253ee6f5c1. {{(pid=68906) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1712.385529] env[68906]: DEBUG oslo_concurrency.lockutils [req-97325af0-0efa-430c-acfe-d50c02609cd6 req-6c1b9e11-2b48-41c4-bb15-6f8da3fcd829 service nova] Acquiring lock "refresh_cache-ce6e5cd6-efb8-46d1-811d-74c084661cce" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1712.385675] env[68906]: DEBUG oslo_concurrency.lockutils [req-97325af0-0efa-430c-acfe-d50c02609cd6 req-6c1b9e11-2b48-41c4-bb15-6f8da3fcd829 service nova] Acquired lock "refresh_cache-ce6e5cd6-efb8-46d1-811d-74c084661cce" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1712.385838] env[68906]: DEBUG nova.network.neutron [req-97325af0-0efa-430c-acfe-d50c02609cd6 req-6c1b9e11-2b48-41c4-bb15-6f8da3fcd829 service nova] [instance: ce6e5cd6-efb8-46d1-811d-74c084661cce] Refreshing network info cache for port 0d935ad7-db44-4bc0-98a0-a5253ee6f5c1 {{(pid=68906) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1712.621108] env[68906]: DEBUG nova.network.neutron [req-97325af0-0efa-430c-acfe-d50c02609cd6 req-6c1b9e11-2b48-41c4-bb15-6f8da3fcd829 service nova] [instance: ce6e5cd6-efb8-46d1-811d-74c084661cce] Updated VIF entry in instance network info cache for port 0d935ad7-db44-4bc0-98a0-a5253ee6f5c1. 
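The instance_info_cache payloads recorded in this log (one above at the initial cache fill, another just below after the VIF refresh) are plain list-of-VIF structures, so extracting data from them is a matter of walking vif -> network -> subnets -> ips. A short sketch against exactly the shape shown, collecting the fixed addresses:

def fixed_ips(network_info):
    # network_info: the list from an "Updating instance_info_cache with
    # network_info: [...]" record.
    addrs = []
    for vif in network_info:
        for subnet in vif['network']['subnets']:
            for ip in subnet['ips']:
                if ip['type'] == 'fixed':
                    addrs.append(ip['address'])
    return addrs

# For the cache entries in this log it returns ['192.168.128.7'].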
{{(pid=68906) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1712.622032] env[68906]: DEBUG nova.network.neutron [req-97325af0-0efa-430c-acfe-d50c02609cd6 req-6c1b9e11-2b48-41c4-bb15-6f8da3fcd829 service nova] [instance: ce6e5cd6-efb8-46d1-811d-74c084661cce] Updating instance_info_cache with network_info: [{"id": "0d935ad7-db44-4bc0-98a0-a5253ee6f5c1", "address": "fa:16:3e:34:ca:3a", "network": {"id": "cbbbf860-58e5-4164-8de8-b1492ffc7605", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-1183077938-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "da1df204e7064662bf5c15a1598c0d4e", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "6eb7e3e9-5cc2-40f1-a6eb-f70f06531667", "external-id": "nsx-vlan-transportzone-938", "segmentation_id": 938, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap0d935ad7-db", "ovs_interfaceid": "0d935ad7-db44-4bc0-98a0-a5253ee6f5c1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68906) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1712.634701] env[68906]: DEBUG oslo_concurrency.lockutils [req-97325af0-0efa-430c-acfe-d50c02609cd6 req-6c1b9e11-2b48-41c4-bb15-6f8da3fcd829 service nova] Releasing lock "refresh_cache-ce6e5cd6-efb8-46d1-811d-74c084661cce" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1726.554057] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1728.141586] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1728.141975] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Starting heal instance info cache {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 1728.141975] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Rebuilding the list of instances to heal {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 1728.165751] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: 6c28f571-e74a-48f9-9cc7-a9e4ddea8b09] Skipping network cache update for instance because it is Building. 
{{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1728.165976] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: 7466df8a-59a9-49b9-bff7-c4efbeae3eee] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1728.166138] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: 89171680-c76d-4826-9236-379542661ffb] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1728.166271] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: 9b884416-df89-4d8c-b2ab-0667db52a718] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1728.166397] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: aed06616-d008-4695-b66e-9f40acf5ebd3] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1728.166520] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: 17327bc3-433e-4006-93c7-e53714ed70c2] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1728.166641] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: 32f5b54d-30bf-4fe9-9622-3ff74344b3f3] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1728.166759] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: 922d81ba-c8d2-43ba-b1c5-f2943418d6a2] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1728.166876] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: 736db39c-e5e5-4a54-b85a-aa5c703f432e] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1728.166992] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: ce6e5cd6-efb8-46d1-811d-74c084661cce] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1728.167128] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Didn't find any instances for network info cache update. 
{{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 1729.140308] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1730.140990] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1732.141996] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1732.142274] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1732.142433] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=68906) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 1734.142607] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1737.136606] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1737.140282] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager.update_available_resource {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1737.153183] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1737.153412] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1737.153580] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1737.153738] env[68906]: DEBUG nova.compute.resource_tracker [None 
req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=68906) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1737.154897] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f5a0314d-c1b6-40c3-95d6-09688ca67afd {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1737.164263] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9e00e311-d90d-429a-a710-436a5774a229 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1737.178158] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9ebcdf0f-1c70-47fa-969c-fa1a531746b8 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1737.184253] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b7fda115-90f0-4cc9-8477-29b2ea35c056 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1737.213888] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180962MB free_disk=93GB free_vcpus=48 pci_devices=None {{(pid=68906) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1737.214034] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1737.214223] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1737.286013] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 6c28f571-e74a-48f9-9cc7-a9e4ddea8b09 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1737.286202] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 7466df8a-59a9-49b9-bff7-c4efbeae3eee actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1737.286330] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 89171680-c76d-4826-9236-379542661ffb actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1737.286482] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 9b884416-df89-4d8c-b2ab-0667db52a718 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1737.286614] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance aed06616-d008-4695-b66e-9f40acf5ebd3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1737.286733] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 17327bc3-433e-4006-93c7-e53714ed70c2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1737.286848] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 32f5b54d-30bf-4fe9-9622-3ff74344b3f3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1737.286963] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 922d81ba-c8d2-43ba-b1c5-f2943418d6a2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1737.287089] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 736db39c-e5e5-4a54-b85a-aa5c703f432e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1737.287204] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance ce6e5cd6-efb8-46d1-811d-74c084661cce actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1737.297750] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 7994d291-b4bf-48f5-ad34-c1f484d77f6e has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1737.307656] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 860248ea-e77b-4ff6-af64-b75f88a31348 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1737.317611] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 3cfde5a7-3148-426c-8867-ffafb33dc95b has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1737.327501] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 709defd2-4089-410e-b317-c41c97e01f62 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1737.327732] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=68906) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1737.327878] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=68906) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1737.494878] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1b7c43c7-2991-4ccc-bc9f-2a73975b650f {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1737.502437] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7e80c49a-0a79-43c0-84b0-1705ee230f26 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1737.531544] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6c2825e9-591d-4476-9102-830e12599cf8 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1737.538455] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6db889e4-6822-4641-948f-94ad2b762b3c {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1737.551459] env[68906]: DEBUG nova.compute.provider_tree [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Inventory has not changed in ProviderTree for provider: 
1119f6db-bfd7-4ef3-bdff-5c6974dc249b {{(pid=68906) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1737.559287] env[68906]: DEBUG nova.scheduler.client.report [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Inventory has not changed for provider 1119f6db-bfd7-4ef3-bdff-5c6974dc249b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 93, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68906) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1737.573372] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=68906) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1737.573568] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.359s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1755.460056] env[68906]: WARNING oslo_vmware.rw_handles [None req-8cc1b37d-3cc0-4e00-8b4e-1ea1e183df21 tempest-ServerAddressesTestJSON-218061599 tempest-ServerAddressesTestJSON-218061599-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1755.460056] env[68906]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1755.460056] env[68906]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1755.460056] env[68906]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1755.460056] env[68906]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1755.460056] env[68906]: ERROR oslo_vmware.rw_handles response.begin() [ 1755.460056] env[68906]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1755.460056] env[68906]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1755.460056] env[68906]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1755.460056] env[68906]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1755.460056] env[68906]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1755.460056] env[68906]: ERROR oslo_vmware.rw_handles [ 1755.460056] env[68906]: DEBUG nova.virt.vmwareapi.images [None req-8cc1b37d-3cc0-4e00-8b4e-1ea1e183df21 tempest-ServerAddressesTestJSON-218061599 tempest-ServerAddressesTestJSON-218061599-project-member] [instance: 6c28f571-e74a-48f9-9cc7-a9e4ddea8b09] Downloaded image file data b1400c31-d33b-4e13-944f-4c645e62493e to vmware_temp/3255d9a9-c1d1-4deb-b5d5-64da4b1eb652/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk on the data store datastore2 {{(pid=68906) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1755.461851] env[68906]: DEBUG nova.virt.vmwareapi.vmops 
[None req-8cc1b37d-3cc0-4e00-8b4e-1ea1e183df21 tempest-ServerAddressesTestJSON-218061599 tempest-ServerAddressesTestJSON-218061599-project-member] [instance: 6c28f571-e74a-48f9-9cc7-a9e4ddea8b09] Caching image {{(pid=68906) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1755.462126] env[68906]: DEBUG nova.virt.vmwareapi.vm_util [None req-8cc1b37d-3cc0-4e00-8b4e-1ea1e183df21 tempest-ServerAddressesTestJSON-218061599 tempest-ServerAddressesTestJSON-218061599-project-member] Copying Virtual Disk [datastore2] vmware_temp/3255d9a9-c1d1-4deb-b5d5-64da4b1eb652/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk to [datastore2] vmware_temp/3255d9a9-c1d1-4deb-b5d5-64da4b1eb652/b1400c31-d33b-4e13-944f-4c645e62493e/b1400c31-d33b-4e13-944f-4c645e62493e.vmdk {{(pid=68906) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1755.462430] env[68906]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-3c84ac87-3a5c-4493-9ae7-3dcc23ab6a76 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1755.471414] env[68906]: DEBUG oslo_vmware.api [None req-8cc1b37d-3cc0-4e00-8b4e-1ea1e183df21 tempest-ServerAddressesTestJSON-218061599 tempest-ServerAddressesTestJSON-218061599-project-member] Waiting for the task: (returnval){ [ 1755.471414] env[68906]: value = "task-3475426" [ 1755.471414] env[68906]: _type = "Task" [ 1755.471414] env[68906]: } to complete. {{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1755.479441] env[68906]: DEBUG oslo_vmware.api [None req-8cc1b37d-3cc0-4e00-8b4e-1ea1e183df21 tempest-ServerAddressesTestJSON-218061599 tempest-ServerAddressesTestJSON-218061599-project-member] Task: {'id': task-3475426, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1755.981540] env[68906]: DEBUG oslo_vmware.exceptions [None req-8cc1b37d-3cc0-4e00-8b4e-1ea1e183df21 tempest-ServerAddressesTestJSON-218061599 tempest-ServerAddressesTestJSON-218061599-project-member] Fault InvalidArgument not matched. 
{{(pid=68906) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1755.981820] env[68906]: DEBUG oslo_concurrency.lockutils [None req-8cc1b37d-3cc0-4e00-8b4e-1ea1e183df21 tempest-ServerAddressesTestJSON-218061599 tempest-ServerAddressesTestJSON-218061599-project-member] Releasing lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e/b1400c31-d33b-4e13-944f-4c645e62493e.vmdk" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1755.982407] env[68906]: ERROR nova.compute.manager [None req-8cc1b37d-3cc0-4e00-8b4e-1ea1e183df21 tempest-ServerAddressesTestJSON-218061599 tempest-ServerAddressesTestJSON-218061599-project-member] [instance: 6c28f571-e74a-48f9-9cc7-a9e4ddea8b09] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1755.982407] env[68906]: Faults: ['InvalidArgument'] [ 1755.982407] env[68906]: ERROR nova.compute.manager [instance: 6c28f571-e74a-48f9-9cc7-a9e4ddea8b09] Traceback (most recent call last): [ 1755.982407] env[68906]: ERROR nova.compute.manager [instance: 6c28f571-e74a-48f9-9cc7-a9e4ddea8b09] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1755.982407] env[68906]: ERROR nova.compute.manager [instance: 6c28f571-e74a-48f9-9cc7-a9e4ddea8b09] yield resources [ 1755.982407] env[68906]: ERROR nova.compute.manager [instance: 6c28f571-e74a-48f9-9cc7-a9e4ddea8b09] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1755.982407] env[68906]: ERROR nova.compute.manager [instance: 6c28f571-e74a-48f9-9cc7-a9e4ddea8b09] self.driver.spawn(context, instance, image_meta, [ 1755.982407] env[68906]: ERROR nova.compute.manager [instance: 6c28f571-e74a-48f9-9cc7-a9e4ddea8b09] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1755.982407] env[68906]: ERROR nova.compute.manager [instance: 6c28f571-e74a-48f9-9cc7-a9e4ddea8b09] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1755.982407] env[68906]: ERROR nova.compute.manager [instance: 6c28f571-e74a-48f9-9cc7-a9e4ddea8b09] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1755.982407] env[68906]: ERROR nova.compute.manager [instance: 6c28f571-e74a-48f9-9cc7-a9e4ddea8b09] self._fetch_image_if_missing(context, vi) [ 1755.982407] env[68906]: ERROR nova.compute.manager [instance: 6c28f571-e74a-48f9-9cc7-a9e4ddea8b09] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1755.982815] env[68906]: ERROR nova.compute.manager [instance: 6c28f571-e74a-48f9-9cc7-a9e4ddea8b09] image_cache(vi, tmp_image_ds_loc) [ 1755.982815] env[68906]: ERROR nova.compute.manager [instance: 6c28f571-e74a-48f9-9cc7-a9e4ddea8b09] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1755.982815] env[68906]: ERROR nova.compute.manager [instance: 6c28f571-e74a-48f9-9cc7-a9e4ddea8b09] vm_util.copy_virtual_disk( [ 1755.982815] env[68906]: ERROR nova.compute.manager [instance: 6c28f571-e74a-48f9-9cc7-a9e4ddea8b09] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1755.982815] env[68906]: ERROR nova.compute.manager [instance: 6c28f571-e74a-48f9-9cc7-a9e4ddea8b09] session._wait_for_task(vmdk_copy_task) [ 1755.982815] env[68906]: ERROR nova.compute.manager [instance: 6c28f571-e74a-48f9-9cc7-a9e4ddea8b09] File 
"/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1755.982815] env[68906]: ERROR nova.compute.manager [instance: 6c28f571-e74a-48f9-9cc7-a9e4ddea8b09] return self.wait_for_task(task_ref) [ 1755.982815] env[68906]: ERROR nova.compute.manager [instance: 6c28f571-e74a-48f9-9cc7-a9e4ddea8b09] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1755.982815] env[68906]: ERROR nova.compute.manager [instance: 6c28f571-e74a-48f9-9cc7-a9e4ddea8b09] return evt.wait() [ 1755.982815] env[68906]: ERROR nova.compute.manager [instance: 6c28f571-e74a-48f9-9cc7-a9e4ddea8b09] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1755.982815] env[68906]: ERROR nova.compute.manager [instance: 6c28f571-e74a-48f9-9cc7-a9e4ddea8b09] result = hub.switch() [ 1755.982815] env[68906]: ERROR nova.compute.manager [instance: 6c28f571-e74a-48f9-9cc7-a9e4ddea8b09] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1755.982815] env[68906]: ERROR nova.compute.manager [instance: 6c28f571-e74a-48f9-9cc7-a9e4ddea8b09] return self.greenlet.switch() [ 1755.983243] env[68906]: ERROR nova.compute.manager [instance: 6c28f571-e74a-48f9-9cc7-a9e4ddea8b09] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1755.983243] env[68906]: ERROR nova.compute.manager [instance: 6c28f571-e74a-48f9-9cc7-a9e4ddea8b09] self.f(*self.args, **self.kw) [ 1755.983243] env[68906]: ERROR nova.compute.manager [instance: 6c28f571-e74a-48f9-9cc7-a9e4ddea8b09] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1755.983243] env[68906]: ERROR nova.compute.manager [instance: 6c28f571-e74a-48f9-9cc7-a9e4ddea8b09] raise exceptions.translate_fault(task_info.error) [ 1755.983243] env[68906]: ERROR nova.compute.manager [instance: 6c28f571-e74a-48f9-9cc7-a9e4ddea8b09] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1755.983243] env[68906]: ERROR nova.compute.manager [instance: 6c28f571-e74a-48f9-9cc7-a9e4ddea8b09] Faults: ['InvalidArgument'] [ 1755.983243] env[68906]: ERROR nova.compute.manager [instance: 6c28f571-e74a-48f9-9cc7-a9e4ddea8b09] [ 1755.983243] env[68906]: INFO nova.compute.manager [None req-8cc1b37d-3cc0-4e00-8b4e-1ea1e183df21 tempest-ServerAddressesTestJSON-218061599 tempest-ServerAddressesTestJSON-218061599-project-member] [instance: 6c28f571-e74a-48f9-9cc7-a9e4ddea8b09] Terminating instance [ 1755.985597] env[68906]: DEBUG nova.compute.manager [None req-8cc1b37d-3cc0-4e00-8b4e-1ea1e183df21 tempest-ServerAddressesTestJSON-218061599 tempest-ServerAddressesTestJSON-218061599-project-member] [instance: 6c28f571-e74a-48f9-9cc7-a9e4ddea8b09] Start destroying the instance on the hypervisor. 
{{(pid=68906) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1755.985797] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-8cc1b37d-3cc0-4e00-8b4e-1ea1e183df21 tempest-ServerAddressesTestJSON-218061599 tempest-ServerAddressesTestJSON-218061599-project-member] [instance: 6c28f571-e74a-48f9-9cc7-a9e4ddea8b09] Destroying instance {{(pid=68906) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1755.986093] env[68906]: DEBUG oslo_concurrency.lockutils [None req-5dfdbe4f-bf7e-46a4-9601-537f14957a5d tempest-ServerGroupTestJSON-848006695 tempest-ServerGroupTestJSON-848006695-project-member] Acquired lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e/b1400c31-d33b-4e13-944f-4c645e62493e.vmdk" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1755.986292] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-5dfdbe4f-bf7e-46a4-9601-537f14957a5d tempest-ServerGroupTestJSON-848006695 tempest-ServerGroupTestJSON-848006695-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=68906) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1755.987125] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d94651ea-33cc-40e3-a46e-63c5a218b89a {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1755.989831] env[68906]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-ecb7aac2-b09e-4e3e-b337-3b7f6d913849 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1755.995692] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-8cc1b37d-3cc0-4e00-8b4e-1ea1e183df21 tempest-ServerAddressesTestJSON-218061599 tempest-ServerAddressesTestJSON-218061599-project-member] [instance: 6c28f571-e74a-48f9-9cc7-a9e4ddea8b09] Unregistering the VM {{(pid=68906) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1755.995913] env[68906]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-b0d97723-be21-4dfb-9c6d-f7c6a479d27d {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1755.998026] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-5dfdbe4f-bf7e-46a4-9601-537f14957a5d tempest-ServerGroupTestJSON-848006695 tempest-ServerGroupTestJSON-848006695-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=68906) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1755.998209] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-5dfdbe4f-bf7e-46a4-9601-537f14957a5d tempest-ServerGroupTestJSON-848006695 tempest-ServerGroupTestJSON-848006695-project-member] Folder [datastore2] devstack-image-cache_base created. 
{{(pid=68906) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1755.999146] env[68906]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-077d2d0a-3372-4976-8bdf-4ea0d28bf959 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1756.003650] env[68906]: DEBUG oslo_vmware.api [None req-5dfdbe4f-bf7e-46a4-9601-537f14957a5d tempest-ServerGroupTestJSON-848006695 tempest-ServerGroupTestJSON-848006695-project-member] Waiting for the task: (returnval){ [ 1756.003650] env[68906]: value = "session[52a3cb0d-1212-490c-2669-91043b4da4d8]52b48282-0cde-083f-a51c-c6e5f432272a" [ 1756.003650] env[68906]: _type = "Task" [ 1756.003650] env[68906]: } to complete. {{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1756.010958] env[68906]: DEBUG oslo_vmware.api [None req-5dfdbe4f-bf7e-46a4-9601-537f14957a5d tempest-ServerGroupTestJSON-848006695 tempest-ServerGroupTestJSON-848006695-project-member] Task: {'id': session[52a3cb0d-1212-490c-2669-91043b4da4d8]52b48282-0cde-083f-a51c-c6e5f432272a, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1756.061475] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-8cc1b37d-3cc0-4e00-8b4e-1ea1e183df21 tempest-ServerAddressesTestJSON-218061599 tempest-ServerAddressesTestJSON-218061599-project-member] [instance: 6c28f571-e74a-48f9-9cc7-a9e4ddea8b09] Unregistered the VM {{(pid=68906) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1756.061686] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-8cc1b37d-3cc0-4e00-8b4e-1ea1e183df21 tempest-ServerAddressesTestJSON-218061599 tempest-ServerAddressesTestJSON-218061599-project-member] [instance: 6c28f571-e74a-48f9-9cc7-a9e4ddea8b09] Deleting contents of the VM from datastore datastore2 {{(pid=68906) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1756.061870] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-8cc1b37d-3cc0-4e00-8b4e-1ea1e183df21 tempest-ServerAddressesTestJSON-218061599 tempest-ServerAddressesTestJSON-218061599-project-member] Deleting the datastore file [datastore2] 6c28f571-e74a-48f9-9cc7-a9e4ddea8b09 {{(pid=68906) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1756.062144] env[68906]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-33921e6f-1a4b-4218-8e38-e179873c856b {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1756.067511] env[68906]: DEBUG oslo_vmware.api [None req-8cc1b37d-3cc0-4e00-8b4e-1ea1e183df21 tempest-ServerAddressesTestJSON-218061599 tempest-ServerAddressesTestJSON-218061599-project-member] Waiting for the task: (returnval){ [ 1756.067511] env[68906]: value = "task-3475428" [ 1756.067511] env[68906]: _type = "Task" [ 1756.067511] env[68906]: } to complete. {{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1756.075177] env[68906]: DEBUG oslo_vmware.api [None req-8cc1b37d-3cc0-4e00-8b4e-1ea1e183df21 tempest-ServerAddressesTestJSON-218061599 tempest-ServerAddressesTestJSON-218061599-project-member] Task: {'id': task-3475428, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1756.514080] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-5dfdbe4f-bf7e-46a4-9601-537f14957a5d tempest-ServerGroupTestJSON-848006695 tempest-ServerGroupTestJSON-848006695-project-member] [instance: 7466df8a-59a9-49b9-bff7-c4efbeae3eee] Preparing fetch location {{(pid=68906) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1756.514419] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-5dfdbe4f-bf7e-46a4-9601-537f14957a5d tempest-ServerGroupTestJSON-848006695 tempest-ServerGroupTestJSON-848006695-project-member] Creating directory with path [datastore2] vmware_temp/5b64a95f-47e8-4c41-bd9f-b3db1571dc42/b1400c31-d33b-4e13-944f-4c645e62493e {{(pid=68906) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1756.514589] env[68906]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-e82d6eba-7e03-45c6-b220-a31dd06181da {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1756.525554] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-5dfdbe4f-bf7e-46a4-9601-537f14957a5d tempest-ServerGroupTestJSON-848006695 tempest-ServerGroupTestJSON-848006695-project-member] Created directory with path [datastore2] vmware_temp/5b64a95f-47e8-4c41-bd9f-b3db1571dc42/b1400c31-d33b-4e13-944f-4c645e62493e {{(pid=68906) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1756.525742] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-5dfdbe4f-bf7e-46a4-9601-537f14957a5d tempest-ServerGroupTestJSON-848006695 tempest-ServerGroupTestJSON-848006695-project-member] [instance: 7466df8a-59a9-49b9-bff7-c4efbeae3eee] Fetch image to [datastore2] vmware_temp/5b64a95f-47e8-4c41-bd9f-b3db1571dc42/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk {{(pid=68906) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1756.525911] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-5dfdbe4f-bf7e-46a4-9601-537f14957a5d tempest-ServerGroupTestJSON-848006695 tempest-ServerGroupTestJSON-848006695-project-member] [instance: 7466df8a-59a9-49b9-bff7-c4efbeae3eee] Downloading image file data b1400c31-d33b-4e13-944f-4c645e62493e to [datastore2] vmware_temp/5b64a95f-47e8-4c41-bd9f-b3db1571dc42/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk on the data store datastore2 {{(pid=68906) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1756.526644] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-32ed539d-4ed2-47ff-8137-0e1c416f09d9 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1756.533129] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6bd3e81e-aaa8-40ec-909f-077497b1ae47 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1756.541838] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-93a7def9-f8ef-4345-8788-be14c4a645b2 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1756.575067] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-62dfc2a3-1b8d-4c2d-b546-d2ffd16cbddc {{(pid=68906) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1756.581698] env[68906]: DEBUG oslo_vmware.api [None req-8cc1b37d-3cc0-4e00-8b4e-1ea1e183df21 tempest-ServerAddressesTestJSON-218061599 tempest-ServerAddressesTestJSON-218061599-project-member] Task: {'id': task-3475428, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.073982} completed successfully. {{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1756.583032] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-8cc1b37d-3cc0-4e00-8b4e-1ea1e183df21 tempest-ServerAddressesTestJSON-218061599 tempest-ServerAddressesTestJSON-218061599-project-member] Deleted the datastore file {{(pid=68906) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1756.583224] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-8cc1b37d-3cc0-4e00-8b4e-1ea1e183df21 tempest-ServerAddressesTestJSON-218061599 tempest-ServerAddressesTestJSON-218061599-project-member] [instance: 6c28f571-e74a-48f9-9cc7-a9e4ddea8b09] Deleted contents of the VM from datastore datastore2 {{(pid=68906) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1756.583397] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-8cc1b37d-3cc0-4e00-8b4e-1ea1e183df21 tempest-ServerAddressesTestJSON-218061599 tempest-ServerAddressesTestJSON-218061599-project-member] [instance: 6c28f571-e74a-48f9-9cc7-a9e4ddea8b09] Instance destroyed {{(pid=68906) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1756.583572] env[68906]: INFO nova.compute.manager [None req-8cc1b37d-3cc0-4e00-8b4e-1ea1e183df21 tempest-ServerAddressesTestJSON-218061599 tempest-ServerAddressesTestJSON-218061599-project-member] [instance: 6c28f571-e74a-48f9-9cc7-a9e4ddea8b09] Took 0.60 seconds to destroy the instance on the hypervisor. 
[ 1756.585323] env[68906]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-92b07001-9ed4-43e4-a29d-ddab4ec886d8 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1756.587121] env[68906]: DEBUG nova.compute.claims [None req-8cc1b37d-3cc0-4e00-8b4e-1ea1e183df21 tempest-ServerAddressesTestJSON-218061599 tempest-ServerAddressesTestJSON-218061599-project-member] [instance: 6c28f571-e74a-48f9-9cc7-a9e4ddea8b09] Aborting claim: {{(pid=68906) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1756.587292] env[68906]: DEBUG oslo_concurrency.lockutils [None req-8cc1b37d-3cc0-4e00-8b4e-1ea1e183df21 tempest-ServerAddressesTestJSON-218061599 tempest-ServerAddressesTestJSON-218061599-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1756.587506] env[68906]: DEBUG oslo_concurrency.lockutils [None req-8cc1b37d-3cc0-4e00-8b4e-1ea1e183df21 tempest-ServerAddressesTestJSON-218061599 tempest-ServerAddressesTestJSON-218061599-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1756.607495] env[68906]: DEBUG nova.virt.vmwareapi.images [None req-5dfdbe4f-bf7e-46a4-9601-537f14957a5d tempest-ServerGroupTestJSON-848006695 tempest-ServerGroupTestJSON-848006695-project-member] [instance: 7466df8a-59a9-49b9-bff7-c4efbeae3eee] Downloading image file data b1400c31-d33b-4e13-944f-4c645e62493e to the data store datastore2 {{(pid=68906) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1756.720532] env[68906]: DEBUG oslo_vmware.rw_handles [None req-5dfdbe4f-bf7e-46a4-9601-537f14957a5d tempest-ServerGroupTestJSON-848006695 tempest-ServerGroupTestJSON-848006695-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/5b64a95f-47e8-4c41-bd9f-b3db1571dc42/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=68906) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1756.779547] env[68906]: DEBUG oslo_vmware.rw_handles [None req-5dfdbe4f-bf7e-46a4-9601-537f14957a5d tempest-ServerGroupTestJSON-848006695 tempest-ServerGroupTestJSON-848006695-project-member] Completed reading data from the image iterator. {{(pid=68906) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1756.779809] env[68906]: DEBUG oslo_vmware.rw_handles [None req-5dfdbe4f-bf7e-46a4-9601-537f14957a5d tempest-ServerGroupTestJSON-848006695 tempest-ServerGroupTestJSON-848006695-project-member] Closing write handle for https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/5b64a95f-47e8-4c41-bd9f-b3db1571dc42/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=68906) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1756.849234] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1d7ee632-13ef-4ea3-96d9-3431dd8f1b40 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1756.856914] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f9e2e440-6cb9-4e2e-a9f8-14b30fdee06a {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1756.886605] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e5a97d1d-39ec-4c07-abdd-e4de93f04705 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1756.893868] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f2734ba2-0e23-4acd-84d2-68fa3dd82956 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1756.906971] env[68906]: DEBUG nova.compute.provider_tree [None req-8cc1b37d-3cc0-4e00-8b4e-1ea1e183df21 tempest-ServerAddressesTestJSON-218061599 tempest-ServerAddressesTestJSON-218061599-project-member] Inventory has not changed in ProviderTree for provider: 1119f6db-bfd7-4ef3-bdff-5c6974dc249b {{(pid=68906) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1756.915808] env[68906]: DEBUG nova.scheduler.client.report [None req-8cc1b37d-3cc0-4e00-8b4e-1ea1e183df21 tempest-ServerAddressesTestJSON-218061599 tempest-ServerAddressesTestJSON-218061599-project-member] Inventory has not changed for provider 1119f6db-bfd7-4ef3-bdff-5c6974dc249b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 93, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68906) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1756.934409] env[68906]: DEBUG oslo_concurrency.lockutils [None req-8cc1b37d-3cc0-4e00-8b4e-1ea1e183df21 tempest-ServerAddressesTestJSON-218061599 tempest-ServerAddressesTestJSON-218061599-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.347s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1756.934963] env[68906]: ERROR nova.compute.manager [None req-8cc1b37d-3cc0-4e00-8b4e-1ea1e183df21 tempest-ServerAddressesTestJSON-218061599 tempest-ServerAddressesTestJSON-218061599-project-member] [instance: 6c28f571-e74a-48f9-9cc7-a9e4ddea8b09] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1756.934963] env[68906]: Faults: ['InvalidArgument'] [ 1756.934963] env[68906]: ERROR nova.compute.manager [instance: 6c28f571-e74a-48f9-9cc7-a9e4ddea8b09] Traceback (most recent call last): [ 1756.934963] env[68906]: ERROR nova.compute.manager [instance: 6c28f571-e74a-48f9-9cc7-a9e4ddea8b09] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1756.934963] env[68906]: 
ERROR nova.compute.manager [instance: 6c28f571-e74a-48f9-9cc7-a9e4ddea8b09] self.driver.spawn(context, instance, image_meta, [ 1756.934963] env[68906]: ERROR nova.compute.manager [instance: 6c28f571-e74a-48f9-9cc7-a9e4ddea8b09] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1756.934963] env[68906]: ERROR nova.compute.manager [instance: 6c28f571-e74a-48f9-9cc7-a9e4ddea8b09] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1756.934963] env[68906]: ERROR nova.compute.manager [instance: 6c28f571-e74a-48f9-9cc7-a9e4ddea8b09] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1756.934963] env[68906]: ERROR nova.compute.manager [instance: 6c28f571-e74a-48f9-9cc7-a9e4ddea8b09] self._fetch_image_if_missing(context, vi) [ 1756.934963] env[68906]: ERROR nova.compute.manager [instance: 6c28f571-e74a-48f9-9cc7-a9e4ddea8b09] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1756.934963] env[68906]: ERROR nova.compute.manager [instance: 6c28f571-e74a-48f9-9cc7-a9e4ddea8b09] image_cache(vi, tmp_image_ds_loc) [ 1756.934963] env[68906]: ERROR nova.compute.manager [instance: 6c28f571-e74a-48f9-9cc7-a9e4ddea8b09] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1756.935432] env[68906]: ERROR nova.compute.manager [instance: 6c28f571-e74a-48f9-9cc7-a9e4ddea8b09] vm_util.copy_virtual_disk( [ 1756.935432] env[68906]: ERROR nova.compute.manager [instance: 6c28f571-e74a-48f9-9cc7-a9e4ddea8b09] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1756.935432] env[68906]: ERROR nova.compute.manager [instance: 6c28f571-e74a-48f9-9cc7-a9e4ddea8b09] session._wait_for_task(vmdk_copy_task) [ 1756.935432] env[68906]: ERROR nova.compute.manager [instance: 6c28f571-e74a-48f9-9cc7-a9e4ddea8b09] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1756.935432] env[68906]: ERROR nova.compute.manager [instance: 6c28f571-e74a-48f9-9cc7-a9e4ddea8b09] return self.wait_for_task(task_ref) [ 1756.935432] env[68906]: ERROR nova.compute.manager [instance: 6c28f571-e74a-48f9-9cc7-a9e4ddea8b09] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1756.935432] env[68906]: ERROR nova.compute.manager [instance: 6c28f571-e74a-48f9-9cc7-a9e4ddea8b09] return evt.wait() [ 1756.935432] env[68906]: ERROR nova.compute.manager [instance: 6c28f571-e74a-48f9-9cc7-a9e4ddea8b09] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1756.935432] env[68906]: ERROR nova.compute.manager [instance: 6c28f571-e74a-48f9-9cc7-a9e4ddea8b09] result = hub.switch() [ 1756.935432] env[68906]: ERROR nova.compute.manager [instance: 6c28f571-e74a-48f9-9cc7-a9e4ddea8b09] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1756.935432] env[68906]: ERROR nova.compute.manager [instance: 6c28f571-e74a-48f9-9cc7-a9e4ddea8b09] return self.greenlet.switch() [ 1756.935432] env[68906]: ERROR nova.compute.manager [instance: 6c28f571-e74a-48f9-9cc7-a9e4ddea8b09] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1756.935432] env[68906]: ERROR nova.compute.manager [instance: 6c28f571-e74a-48f9-9cc7-a9e4ddea8b09] self.f(*self.args, **self.kw) [ 1756.935772] env[68906]: ERROR nova.compute.manager [instance: 6c28f571-e74a-48f9-9cc7-a9e4ddea8b09] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1756.935772] env[68906]: ERROR nova.compute.manager [instance: 6c28f571-e74a-48f9-9cc7-a9e4ddea8b09] raise exceptions.translate_fault(task_info.error) [ 1756.935772] env[68906]: ERROR nova.compute.manager [instance: 6c28f571-e74a-48f9-9cc7-a9e4ddea8b09] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1756.935772] env[68906]: ERROR nova.compute.manager [instance: 6c28f571-e74a-48f9-9cc7-a9e4ddea8b09] Faults: ['InvalidArgument'] [ 1756.935772] env[68906]: ERROR nova.compute.manager [instance: 6c28f571-e74a-48f9-9cc7-a9e4ddea8b09] [ 1756.935772] env[68906]: DEBUG nova.compute.utils [None req-8cc1b37d-3cc0-4e00-8b4e-1ea1e183df21 tempest-ServerAddressesTestJSON-218061599 tempest-ServerAddressesTestJSON-218061599-project-member] [instance: 6c28f571-e74a-48f9-9cc7-a9e4ddea8b09] VimFaultException {{(pid=68906) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1756.937212] env[68906]: DEBUG nova.compute.manager [None req-8cc1b37d-3cc0-4e00-8b4e-1ea1e183df21 tempest-ServerAddressesTestJSON-218061599 tempest-ServerAddressesTestJSON-218061599-project-member] [instance: 6c28f571-e74a-48f9-9cc7-a9e4ddea8b09] Build of instance 6c28f571-e74a-48f9-9cc7-a9e4ddea8b09 was re-scheduled: A specified parameter was not correct: fileType [ 1756.937212] env[68906]: Faults: ['InvalidArgument'] {{(pid=68906) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1756.937582] env[68906]: DEBUG nova.compute.manager [None req-8cc1b37d-3cc0-4e00-8b4e-1ea1e183df21 tempest-ServerAddressesTestJSON-218061599 tempest-ServerAddressesTestJSON-218061599-project-member] [instance: 6c28f571-e74a-48f9-9cc7-a9e4ddea8b09] Unplugging VIFs for instance {{(pid=68906) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1756.937754] env[68906]: DEBUG nova.compute.manager [None req-8cc1b37d-3cc0-4e00-8b4e-1ea1e183df21 tempest-ServerAddressesTestJSON-218061599 tempest-ServerAddressesTestJSON-218061599-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=68906) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1756.937927] env[68906]: DEBUG nova.compute.manager [None req-8cc1b37d-3cc0-4e00-8b4e-1ea1e183df21 tempest-ServerAddressesTestJSON-218061599 tempest-ServerAddressesTestJSON-218061599-project-member] [instance: 6c28f571-e74a-48f9-9cc7-a9e4ddea8b09] Deallocating network for instance {{(pid=68906) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1756.938102] env[68906]: DEBUG nova.network.neutron [None req-8cc1b37d-3cc0-4e00-8b4e-1ea1e183df21 tempest-ServerAddressesTestJSON-218061599 tempest-ServerAddressesTestJSON-218061599-project-member] [instance: 6c28f571-e74a-48f9-9cc7-a9e4ddea8b09] deallocate_for_instance() {{(pid=68906) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1757.251254] env[68906]: DEBUG nova.network.neutron [None req-8cc1b37d-3cc0-4e00-8b4e-1ea1e183df21 tempest-ServerAddressesTestJSON-218061599 tempest-ServerAddressesTestJSON-218061599-project-member] [instance: 6c28f571-e74a-48f9-9cc7-a9e4ddea8b09] Updating instance_info_cache with network_info: [] {{(pid=68906) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1757.261863] env[68906]: INFO nova.compute.manager [None req-8cc1b37d-3cc0-4e00-8b4e-1ea1e183df21 tempest-ServerAddressesTestJSON-218061599 tempest-ServerAddressesTestJSON-218061599-project-member] [instance: 6c28f571-e74a-48f9-9cc7-a9e4ddea8b09] Took 0.32 seconds to deallocate network for instance. [ 1757.350410] env[68906]: INFO nova.scheduler.client.report [None req-8cc1b37d-3cc0-4e00-8b4e-1ea1e183df21 tempest-ServerAddressesTestJSON-218061599 tempest-ServerAddressesTestJSON-218061599-project-member] Deleted allocations for instance 6c28f571-e74a-48f9-9cc7-a9e4ddea8b09 [ 1757.372896] env[68906]: DEBUG oslo_concurrency.lockutils [None req-8cc1b37d-3cc0-4e00-8b4e-1ea1e183df21 tempest-ServerAddressesTestJSON-218061599 tempest-ServerAddressesTestJSON-218061599-project-member] Lock "6c28f571-e74a-48f9-9cc7-a9e4ddea8b09" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 677.061s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1757.374018] env[68906]: DEBUG oslo_concurrency.lockutils [None req-df13a5bb-f088-42b3-ad46-a3d1262b27bf tempest-ServerAddressesTestJSON-218061599 tempest-ServerAddressesTestJSON-218061599-project-member] Lock "6c28f571-e74a-48f9-9cc7-a9e4ddea8b09" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 480.362s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1757.374249] env[68906]: DEBUG oslo_concurrency.lockutils [None req-df13a5bb-f088-42b3-ad46-a3d1262b27bf tempest-ServerAddressesTestJSON-218061599 tempest-ServerAddressesTestJSON-218061599-project-member] Acquiring lock "6c28f571-e74a-48f9-9cc7-a9e4ddea8b09-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1757.374452] env[68906]: DEBUG oslo_concurrency.lockutils [None req-df13a5bb-f088-42b3-ad46-a3d1262b27bf tempest-ServerAddressesTestJSON-218061599 tempest-ServerAddressesTestJSON-218061599-project-member] Lock "6c28f571-e74a-48f9-9cc7-a9e4ddea8b09-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: 
waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1757.374616] env[68906]: DEBUG oslo_concurrency.lockutils [None req-df13a5bb-f088-42b3-ad46-a3d1262b27bf tempest-ServerAddressesTestJSON-218061599 tempest-ServerAddressesTestJSON-218061599-project-member] Lock "6c28f571-e74a-48f9-9cc7-a9e4ddea8b09-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1757.376510] env[68906]: INFO nova.compute.manager [None req-df13a5bb-f088-42b3-ad46-a3d1262b27bf tempest-ServerAddressesTestJSON-218061599 tempest-ServerAddressesTestJSON-218061599-project-member] [instance: 6c28f571-e74a-48f9-9cc7-a9e4ddea8b09] Terminating instance [ 1757.378129] env[68906]: DEBUG nova.compute.manager [None req-df13a5bb-f088-42b3-ad46-a3d1262b27bf tempest-ServerAddressesTestJSON-218061599 tempest-ServerAddressesTestJSON-218061599-project-member] [instance: 6c28f571-e74a-48f9-9cc7-a9e4ddea8b09] Start destroying the instance on the hypervisor. {{(pid=68906) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1757.378330] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-df13a5bb-f088-42b3-ad46-a3d1262b27bf tempest-ServerAddressesTestJSON-218061599 tempest-ServerAddressesTestJSON-218061599-project-member] [instance: 6c28f571-e74a-48f9-9cc7-a9e4ddea8b09] Destroying instance {{(pid=68906) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1757.378807] env[68906]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-e99f93ca-d4b7-4589-aa2e-74ecafe7da92 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1757.388342] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b592ce47-57d3-43ed-ae00-8753f4e7e305 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1757.399347] env[68906]: DEBUG nova.compute.manager [None req-4af5cd17-ead4-446a-9184-7b7ebdecc7be tempest-ServerTagsTestJSON-693567183 tempest-ServerTagsTestJSON-693567183-project-member] [instance: 7994d291-b4bf-48f5-ad34-c1f484d77f6e] Starting instance... {{(pid=68906) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1757.421025] env[68906]: WARNING nova.virt.vmwareapi.vmops [None req-df13a5bb-f088-42b3-ad46-a3d1262b27bf tempest-ServerAddressesTestJSON-218061599 tempest-ServerAddressesTestJSON-218061599-project-member] [instance: 6c28f571-e74a-48f9-9cc7-a9e4ddea8b09] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 6c28f571-e74a-48f9-9cc7-a9e4ddea8b09 could not be found. [ 1757.421239] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-df13a5bb-f088-42b3-ad46-a3d1262b27bf tempest-ServerAddressesTestJSON-218061599 tempest-ServerAddressesTestJSON-218061599-project-member] [instance: 6c28f571-e74a-48f9-9cc7-a9e4ddea8b09] Instance destroyed {{(pid=68906) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1757.421421] env[68906]: INFO nova.compute.manager [None req-df13a5bb-f088-42b3-ad46-a3d1262b27bf tempest-ServerAddressesTestJSON-218061599 tempest-ServerAddressesTestJSON-218061599-project-member] [instance: 6c28f571-e74a-48f9-9cc7-a9e4ddea8b09] Took 0.04 seconds to destroy the instance on the hypervisor. 
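
The terminate path above serializes on per-instance locks: do_terminate_instance waited 480.362s for the same "6c28f571-..." lock that the failed build had held for 677.061s, and a separate "<uuid>-events" lock briefly guards the instance's event dict. The "Acquiring lock ... / acquired ... / released" lines are emitted by oslo.concurrency; a minimal sketch of that pattern (the function and its empty body are hypothetical stand-ins, not Nova's code):

    from oslo_concurrency import lockutils

    INSTANCE_UUID = '6c28f571-e74a-48f9-9cc7-a9e4ddea8b09'  # from the log above

    @lockutils.synchronized(INSTANCE_UUID)
    def do_terminate_instance():
        # While the lock is held, no other greenthread can build,
        # reschedule or terminate this instance; lockutils logs the
        # "waited" time on acquire and the "held" time on release.
        pass

    do_terminate_instance()
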
[ 1757.421662] env[68906]: DEBUG oslo.service.loopingcall [None req-df13a5bb-f088-42b3-ad46-a3d1262b27bf tempest-ServerAddressesTestJSON-218061599 tempest-ServerAddressesTestJSON-218061599-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=68906) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1757.421886] env[68906]: DEBUG nova.compute.manager [-] [instance: 6c28f571-e74a-48f9-9cc7-a9e4ddea8b09] Deallocating network for instance {{(pid=68906) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1757.421997] env[68906]: DEBUG nova.network.neutron [-] [instance: 6c28f571-e74a-48f9-9cc7-a9e4ddea8b09] deallocate_for_instance() {{(pid=68906) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1757.457154] env[68906]: DEBUG nova.network.neutron [-] [instance: 6c28f571-e74a-48f9-9cc7-a9e4ddea8b09] Updating instance_info_cache with network_info: [] {{(pid=68906) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1757.460356] env[68906]: DEBUG oslo_concurrency.lockutils [None req-4af5cd17-ead4-446a-9184-7b7ebdecc7be tempest-ServerTagsTestJSON-693567183 tempest-ServerTagsTestJSON-693567183-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1757.460602] env[68906]: DEBUG oslo_concurrency.lockutils [None req-4af5cd17-ead4-446a-9184-7b7ebdecc7be tempest-ServerTagsTestJSON-693567183 tempest-ServerTagsTestJSON-693567183-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1757.462089] env[68906]: INFO nova.compute.claims [None req-4af5cd17-ead4-446a-9184-7b7ebdecc7be tempest-ServerTagsTestJSON-693567183 tempest-ServerTagsTestJSON-693567183-project-member] [instance: 7994d291-b4bf-48f5-ad34-c1f484d77f6e] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1757.465320] env[68906]: INFO nova.compute.manager [-] [instance: 6c28f571-e74a-48f9-9cc7-a9e4ddea8b09] Took 0.04 seconds to deallocate network for instance. 
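
The "Waiting for function ... _deallocate_network_with_retries to return" line (logged from func at loopingcall.py:435) comes from oslo.service's RetryDecorator, which re-invokes a callable on selected exceptions with a growing sleep between attempts. A sketch under assumed retry parameters, using a stand-in callable rather than Nova's real network-deallocation helper:

    from oslo_service import loopingcall

    @loopingcall.RetryDecorator(max_retry_count=3, inc_sleep_time=1,
                                max_sleep_time=10,
                                exceptions=(ConnectionError,))
    def _deallocate_network_with_retries():
        # Attempts that raise ConnectionError are retried with an
        # increasing sleep; RetryDecorator logs "Waiting for
        # function ... to return." while it waits.
        return 'deallocated'

    print(_deallocate_network_with_retries())
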
[ 1757.556351] env[68906]: DEBUG oslo_concurrency.lockutils [None req-df13a5bb-f088-42b3-ad46-a3d1262b27bf tempest-ServerAddressesTestJSON-218061599 tempest-ServerAddressesTestJSON-218061599-project-member] Lock "6c28f571-e74a-48f9-9cc7-a9e4ddea8b09" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.182s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1757.656793] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cff38cbe-00c8-41f7-bf63-33b6bd234c19 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1757.664152] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9e499563-1402-4f72-93d2-fde7d181d19a {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1757.694320] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b2feeb1c-5ac6-493f-882f-8b6776bc5453 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1757.701151] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-711eb3cf-b965-4ee8-95c3-0486ab84177a {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1757.713988] env[68906]: DEBUG nova.compute.provider_tree [None req-4af5cd17-ead4-446a-9184-7b7ebdecc7be tempest-ServerTagsTestJSON-693567183 tempest-ServerTagsTestJSON-693567183-project-member] Inventory has not changed in ProviderTree for provider: 1119f6db-bfd7-4ef3-bdff-5c6974dc249b {{(pid=68906) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1757.722704] env[68906]: DEBUG nova.scheduler.client.report [None req-4af5cd17-ead4-446a-9184-7b7ebdecc7be tempest-ServerTagsTestJSON-693567183 tempest-ServerTagsTestJSON-693567183-project-member] Inventory has not changed for provider 1119f6db-bfd7-4ef3-bdff-5c6974dc249b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 93, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68906) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1757.736598] env[68906]: DEBUG oslo_concurrency.lockutils [None req-4af5cd17-ead4-446a-9184-7b7ebdecc7be tempest-ServerTagsTestJSON-693567183 tempest-ServerTagsTestJSON-693567183-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.276s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1757.737093] env[68906]: DEBUG nova.compute.manager [None req-4af5cd17-ead4-446a-9184-7b7ebdecc7be tempest-ServerTagsTestJSON-693567183 tempest-ServerTagsTestJSON-693567183-project-member] [instance: 7994d291-b4bf-48f5-ad34-c1f484d77f6e] Start building networks asynchronously for instance. 
{{(pid=68906) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1757.768267] env[68906]: DEBUG nova.compute.utils [None req-4af5cd17-ead4-446a-9184-7b7ebdecc7be tempest-ServerTagsTestJSON-693567183 tempest-ServerTagsTestJSON-693567183-project-member] Using /dev/sd instead of None {{(pid=68906) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1757.769582] env[68906]: DEBUG nova.compute.manager [None req-4af5cd17-ead4-446a-9184-7b7ebdecc7be tempest-ServerTagsTestJSON-693567183 tempest-ServerTagsTestJSON-693567183-project-member] [instance: 7994d291-b4bf-48f5-ad34-c1f484d77f6e] Allocating IP information in the background. {{(pid=68906) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1757.769778] env[68906]: DEBUG nova.network.neutron [None req-4af5cd17-ead4-446a-9184-7b7ebdecc7be tempest-ServerTagsTestJSON-693567183 tempest-ServerTagsTestJSON-693567183-project-member] [instance: 7994d291-b4bf-48f5-ad34-c1f484d77f6e] allocate_for_instance() {{(pid=68906) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1757.777714] env[68906]: DEBUG nova.compute.manager [None req-4af5cd17-ead4-446a-9184-7b7ebdecc7be tempest-ServerTagsTestJSON-693567183 tempest-ServerTagsTestJSON-693567183-project-member] [instance: 7994d291-b4bf-48f5-ad34-c1f484d77f6e] Start building block device mappings for instance. {{(pid=68906) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1757.833495] env[68906]: DEBUG nova.policy [None req-4af5cd17-ead4-446a-9184-7b7ebdecc7be tempest-ServerTagsTestJSON-693567183 tempest-ServerTagsTestJSON-693567183-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'c25d6fe77124440f8c46f12e9508a49d', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'e41caca4bbc74bdb9f3c35b4b6e2c5ba', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68906) authorize /opt/stack/nova/nova/policy.py:203}} [ 1757.842368] env[68906]: DEBUG nova.compute.manager [None req-4af5cd17-ead4-446a-9184-7b7ebdecc7be tempest-ServerTagsTestJSON-693567183 tempest-ServerTagsTestJSON-693567183-project-member] [instance: 7994d291-b4bf-48f5-ad34-c1f484d77f6e] Start spawning the instance on the hypervisor. 
{{(pid=68906) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1757.866175] env[68906]: DEBUG nova.virt.hardware [None req-4af5cd17-ead4-446a-9184-7b7ebdecc7be tempest-ServerTagsTestJSON-693567183 tempest-ServerTagsTestJSON-693567183-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-17T13:00:39Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-17T13:00:23Z,direct_url=,disk_format='vmdk',id=b1400c31-d33b-4e13-944f-4c645e62493e,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='1ae7bf3a375d41c6af5e7536af51ffd1',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-17T13:00:24Z,virtual_size=,visibility=), allow threads: False {{(pid=68906) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1757.866462] env[68906]: DEBUG nova.virt.hardware [None req-4af5cd17-ead4-446a-9184-7b7ebdecc7be tempest-ServerTagsTestJSON-693567183 tempest-ServerTagsTestJSON-693567183-project-member] Flavor limits 0:0:0 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1757.866663] env[68906]: DEBUG nova.virt.hardware [None req-4af5cd17-ead4-446a-9184-7b7ebdecc7be tempest-ServerTagsTestJSON-693567183 tempest-ServerTagsTestJSON-693567183-project-member] Image limits 0:0:0 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1757.866890] env[68906]: DEBUG nova.virt.hardware [None req-4af5cd17-ead4-446a-9184-7b7ebdecc7be tempest-ServerTagsTestJSON-693567183 tempest-ServerTagsTestJSON-693567183-project-member] Flavor pref 0:0:0 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1757.867063] env[68906]: DEBUG nova.virt.hardware [None req-4af5cd17-ead4-446a-9184-7b7ebdecc7be tempest-ServerTagsTestJSON-693567183 tempest-ServerTagsTestJSON-693567183-project-member] Image pref 0:0:0 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1757.867217] env[68906]: DEBUG nova.virt.hardware [None req-4af5cd17-ead4-446a-9184-7b7ebdecc7be tempest-ServerTagsTestJSON-693567183 tempest-ServerTagsTestJSON-693567183-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1757.867426] env[68906]: DEBUG nova.virt.hardware [None req-4af5cd17-ead4-446a-9184-7b7ebdecc7be tempest-ServerTagsTestJSON-693567183 tempest-ServerTagsTestJSON-693567183-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68906) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1757.867585] env[68906]: DEBUG nova.virt.hardware [None req-4af5cd17-ead4-446a-9184-7b7ebdecc7be tempest-ServerTagsTestJSON-693567183 tempest-ServerTagsTestJSON-693567183-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68906) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1757.867756] env[68906]: DEBUG nova.virt.hardware [None req-4af5cd17-ead4-446a-9184-7b7ebdecc7be 
tempest-ServerTagsTestJSON-693567183 tempest-ServerTagsTestJSON-693567183-project-member] Got 1 possible topologies {{(pid=68906) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1757.867944] env[68906]: DEBUG nova.virt.hardware [None req-4af5cd17-ead4-446a-9184-7b7ebdecc7be tempest-ServerTagsTestJSON-693567183 tempest-ServerTagsTestJSON-693567183-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68906) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1757.868176] env[68906]: DEBUG nova.virt.hardware [None req-4af5cd17-ead4-446a-9184-7b7ebdecc7be tempest-ServerTagsTestJSON-693567183 tempest-ServerTagsTestJSON-693567183-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68906) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1757.869056] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b8e5e99a-4d9d-40b2-a0d0-0c19866e34c9 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1757.876583] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d90c6817-be21-46b8-88e5-e4666dc41b90 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1758.125873] env[68906]: DEBUG nova.network.neutron [None req-4af5cd17-ead4-446a-9184-7b7ebdecc7be tempest-ServerTagsTestJSON-693567183 tempest-ServerTagsTestJSON-693567183-project-member] [instance: 7994d291-b4bf-48f5-ad34-c1f484d77f6e] Successfully created port: 7590ab0e-7e73-49c1-ac51-135d3da9fa74 {{(pid=68906) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1758.711140] env[68906]: DEBUG nova.network.neutron [None req-4af5cd17-ead4-446a-9184-7b7ebdecc7be tempest-ServerTagsTestJSON-693567183 tempest-ServerTagsTestJSON-693567183-project-member] [instance: 7994d291-b4bf-48f5-ad34-c1f484d77f6e] Successfully updated port: 7590ab0e-7e73-49c1-ac51-135d3da9fa74 {{(pid=68906) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1758.723080] env[68906]: DEBUG oslo_concurrency.lockutils [None req-4af5cd17-ead4-446a-9184-7b7ebdecc7be tempest-ServerTagsTestJSON-693567183 tempest-ServerTagsTestJSON-693567183-project-member] Acquiring lock "refresh_cache-7994d291-b4bf-48f5-ad34-c1f484d77f6e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1758.723254] env[68906]: DEBUG oslo_concurrency.lockutils [None req-4af5cd17-ead4-446a-9184-7b7ebdecc7be tempest-ServerTagsTestJSON-693567183 tempest-ServerTagsTestJSON-693567183-project-member] Acquired lock "refresh_cache-7994d291-b4bf-48f5-ad34-c1f484d77f6e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1758.723412] env[68906]: DEBUG nova.network.neutron [None req-4af5cd17-ead4-446a-9184-7b7ebdecc7be tempest-ServerTagsTestJSON-693567183 tempest-ServerTagsTestJSON-693567183-project-member] [instance: 7994d291-b4bf-48f5-ad34-c1f484d77f6e] Building network info cache for instance {{(pid=68906) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1758.763507] env[68906]: DEBUG nova.network.neutron [None req-4af5cd17-ead4-446a-9184-7b7ebdecc7be tempest-ServerTagsTestJSON-693567183 tempest-ServerTagsTestJSON-693567183-project-member] [instance: 7994d291-b4bf-48f5-ad34-c1f484d77f6e] 
Instance cache missing network info. {{(pid=68906) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1758.916409] env[68906]: DEBUG nova.network.neutron [None req-4af5cd17-ead4-446a-9184-7b7ebdecc7be tempest-ServerTagsTestJSON-693567183 tempest-ServerTagsTestJSON-693567183-project-member] [instance: 7994d291-b4bf-48f5-ad34-c1f484d77f6e] Updating instance_info_cache with network_info: [{"id": "7590ab0e-7e73-49c1-ac51-135d3da9fa74", "address": "fa:16:3e:6b:60:1c", "network": {"id": "a09558df-600d-4d90-bd9a-7e731b33ecc6", "bridge": "br-int", "label": "tempest-ServerTagsTestJSON-1811894926-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "e41caca4bbc74bdb9f3c35b4b6e2c5ba", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "aa410d21-2141-45bb-8d0b-16c77304605f", "external-id": "nsx-vlan-transportzone-886", "segmentation_id": 886, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap7590ab0e-7e", "ovs_interfaceid": "7590ab0e-7e73-49c1-ac51-135d3da9fa74", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68906) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1758.929325] env[68906]: DEBUG oslo_concurrency.lockutils [None req-4af5cd17-ead4-446a-9184-7b7ebdecc7be tempest-ServerTagsTestJSON-693567183 tempest-ServerTagsTestJSON-693567183-project-member] Releasing lock "refresh_cache-7994d291-b4bf-48f5-ad34-c1f484d77f6e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1758.929596] env[68906]: DEBUG nova.compute.manager [None req-4af5cd17-ead4-446a-9184-7b7ebdecc7be tempest-ServerTagsTestJSON-693567183 tempest-ServerTagsTestJSON-693567183-project-member] [instance: 7994d291-b4bf-48f5-ad34-c1f484d77f6e] Instance network_info: |[{"id": "7590ab0e-7e73-49c1-ac51-135d3da9fa74", "address": "fa:16:3e:6b:60:1c", "network": {"id": "a09558df-600d-4d90-bd9a-7e731b33ecc6", "bridge": "br-int", "label": "tempest-ServerTagsTestJSON-1811894926-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "e41caca4bbc74bdb9f3c35b4b6e2c5ba", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "aa410d21-2141-45bb-8d0b-16c77304605f", "external-id": "nsx-vlan-transportzone-886", "segmentation_id": 886, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap7590ab0e-7e", "ovs_interfaceid": "7590ab0e-7e73-49c1-ac51-135d3da9fa74", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=68906) _allocate_network_async 
/opt/stack/nova/nova/compute/manager.py:1971}} [ 1758.929998] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-4af5cd17-ead4-446a-9184-7b7ebdecc7be tempest-ServerTagsTestJSON-693567183 tempest-ServerTagsTestJSON-693567183-project-member] [instance: 7994d291-b4bf-48f5-ad34-c1f484d77f6e] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:6b:60:1c', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'aa410d21-2141-45bb-8d0b-16c77304605f', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '7590ab0e-7e73-49c1-ac51-135d3da9fa74', 'vif_model': 'vmxnet3'}] {{(pid=68906) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1758.937416] env[68906]: DEBUG nova.virt.vmwareapi.vm_util [None req-4af5cd17-ead4-446a-9184-7b7ebdecc7be tempest-ServerTagsTestJSON-693567183 tempest-ServerTagsTestJSON-693567183-project-member] Creating folder: Project (e41caca4bbc74bdb9f3c35b4b6e2c5ba). Parent ref: group-v694750. {{(pid=68906) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1758.937943] env[68906]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-92983529-af4e-423a-bb5e-534c4a363274 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1758.949464] env[68906]: INFO nova.virt.vmwareapi.vm_util [None req-4af5cd17-ead4-446a-9184-7b7ebdecc7be tempest-ServerTagsTestJSON-693567183 tempest-ServerTagsTestJSON-693567183-project-member] Created folder: Project (e41caca4bbc74bdb9f3c35b4b6e2c5ba) in parent group-v694750. [ 1758.949654] env[68906]: DEBUG nova.virt.vmwareapi.vm_util [None req-4af5cd17-ead4-446a-9184-7b7ebdecc7be tempest-ServerTagsTestJSON-693567183 tempest-ServerTagsTestJSON-693567183-project-member] Creating folder: Instances. Parent ref: group-v694844. {{(pid=68906) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1758.949920] env[68906]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-d14134e3-4d87-4294-aa8c-83d0755e1a86 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1758.958326] env[68906]: INFO nova.virt.vmwareapi.vm_util [None req-4af5cd17-ead4-446a-9184-7b7ebdecc7be tempest-ServerTagsTestJSON-693567183 tempest-ServerTagsTestJSON-693567183-project-member] Created folder: Instances in parent group-v694844. [ 1758.958326] env[68906]: DEBUG oslo.service.loopingcall [None req-4af5cd17-ead4-446a-9184-7b7ebdecc7be tempest-ServerTagsTestJSON-693567183 tempest-ServerTagsTestJSON-693567183-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=68906) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1758.958455] env[68906]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 7994d291-b4bf-48f5-ad34-c1f484d77f6e] Creating VM on the ESX host {{(pid=68906) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1758.958536] env[68906]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-1f55d8aa-e1a2-432d-acd2-650346ec3b07 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1758.976292] env[68906]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1758.976292] env[68906]: value = "task-3475431" [ 1758.976292] env[68906]: _type = "Task" [ 1758.976292] env[68906]: } to complete. 
{{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1758.983241] env[68906]: DEBUG oslo_vmware.api [-] Task: {'id': task-3475431, 'name': CreateVM_Task} progress is 0%. {{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1759.344301] env[68906]: DEBUG nova.compute.manager [req-1b3d6ba7-2555-452b-97e6-4ba71aa6d4fe req-ef71f117-ceac-4600-a9ff-1ac745bb0aa8 service nova] [instance: 7994d291-b4bf-48f5-ad34-c1f484d77f6e] Received event network-vif-plugged-7590ab0e-7e73-49c1-ac51-135d3da9fa74 {{(pid=68906) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1759.344497] env[68906]: DEBUG oslo_concurrency.lockutils [req-1b3d6ba7-2555-452b-97e6-4ba71aa6d4fe req-ef71f117-ceac-4600-a9ff-1ac745bb0aa8 service nova] Acquiring lock "7994d291-b4bf-48f5-ad34-c1f484d77f6e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1759.344707] env[68906]: DEBUG oslo_concurrency.lockutils [req-1b3d6ba7-2555-452b-97e6-4ba71aa6d4fe req-ef71f117-ceac-4600-a9ff-1ac745bb0aa8 service nova] Lock "7994d291-b4bf-48f5-ad34-c1f484d77f6e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1759.345049] env[68906]: DEBUG oslo_concurrency.lockutils [req-1b3d6ba7-2555-452b-97e6-4ba71aa6d4fe req-ef71f117-ceac-4600-a9ff-1ac745bb0aa8 service nova] Lock "7994d291-b4bf-48f5-ad34-c1f484d77f6e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1759.345049] env[68906]: DEBUG nova.compute.manager [req-1b3d6ba7-2555-452b-97e6-4ba71aa6d4fe req-ef71f117-ceac-4600-a9ff-1ac745bb0aa8 service nova] [instance: 7994d291-b4bf-48f5-ad34-c1f484d77f6e] No waiting events found dispatching network-vif-plugged-7590ab0e-7e73-49c1-ac51-135d3da9fa74 {{(pid=68906) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1759.345206] env[68906]: WARNING nova.compute.manager [req-1b3d6ba7-2555-452b-97e6-4ba71aa6d4fe req-ef71f117-ceac-4600-a9ff-1ac745bb0aa8 service nova] [instance: 7994d291-b4bf-48f5-ad34-c1f484d77f6e] Received unexpected event network-vif-plugged-7590ab0e-7e73-49c1-ac51-135d3da9fa74 for instance with vm_state building and task_state spawning. [ 1759.345365] env[68906]: DEBUG nova.compute.manager [req-1b3d6ba7-2555-452b-97e6-4ba71aa6d4fe req-ef71f117-ceac-4600-a9ff-1ac745bb0aa8 service nova] [instance: 7994d291-b4bf-48f5-ad34-c1f484d77f6e] Received event network-changed-7590ab0e-7e73-49c1-ac51-135d3da9fa74 {{(pid=68906) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1759.345517] env[68906]: DEBUG nova.compute.manager [req-1b3d6ba7-2555-452b-97e6-4ba71aa6d4fe req-ef71f117-ceac-4600-a9ff-1ac745bb0aa8 service nova] [instance: 7994d291-b4bf-48f5-ad34-c1f484d77f6e] Refreshing instance network info cache due to event network-changed-7590ab0e-7e73-49c1-ac51-135d3da9fa74. 
{{(pid=68906) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1759.345696] env[68906]: DEBUG oslo_concurrency.lockutils [req-1b3d6ba7-2555-452b-97e6-4ba71aa6d4fe req-ef71f117-ceac-4600-a9ff-1ac745bb0aa8 service nova] Acquiring lock "refresh_cache-7994d291-b4bf-48f5-ad34-c1f484d77f6e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1759.345831] env[68906]: DEBUG oslo_concurrency.lockutils [req-1b3d6ba7-2555-452b-97e6-4ba71aa6d4fe req-ef71f117-ceac-4600-a9ff-1ac745bb0aa8 service nova] Acquired lock "refresh_cache-7994d291-b4bf-48f5-ad34-c1f484d77f6e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1759.345992] env[68906]: DEBUG nova.network.neutron [req-1b3d6ba7-2555-452b-97e6-4ba71aa6d4fe req-ef71f117-ceac-4600-a9ff-1ac745bb0aa8 service nova] [instance: 7994d291-b4bf-48f5-ad34-c1f484d77f6e] Refreshing network info cache for port 7590ab0e-7e73-49c1-ac51-135d3da9fa74 {{(pid=68906) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1759.488436] env[68906]: DEBUG oslo_vmware.api [-] Task: {'id': task-3475431, 'name': CreateVM_Task, 'duration_secs': 0.29513} completed successfully. {{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1759.488608] env[68906]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 7994d291-b4bf-48f5-ad34-c1f484d77f6e] Created VM on the ESX host {{(pid=68906) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1759.489375] env[68906]: DEBUG oslo_concurrency.lockutils [None req-4af5cd17-ead4-446a-9184-7b7ebdecc7be tempest-ServerTagsTestJSON-693567183 tempest-ServerTagsTestJSON-693567183-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1759.489601] env[68906]: DEBUG oslo_concurrency.lockutils [None req-4af5cd17-ead4-446a-9184-7b7ebdecc7be tempest-ServerTagsTestJSON-693567183 tempest-ServerTagsTestJSON-693567183-project-member] Acquired lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1759.489929] env[68906]: DEBUG oslo_concurrency.lockutils [None req-4af5cd17-ead4-446a-9184-7b7ebdecc7be tempest-ServerTagsTestJSON-693567183 tempest-ServerTagsTestJSON-693567183-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1759.490195] env[68906]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-5b574541-726d-4685-b4ea-46e400a08b34 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1759.494253] env[68906]: DEBUG oslo_vmware.api [None req-4af5cd17-ead4-446a-9184-7b7ebdecc7be tempest-ServerTagsTestJSON-693567183 tempest-ServerTagsTestJSON-693567183-project-member] Waiting for the task: (returnval){ [ 1759.494253] env[68906]: value = "session[52a3cb0d-1212-490c-2669-91043b4da4d8]52d10c56-d5c2-6093-4180-d9248493bb77" [ 1759.494253] env[68906]: _type = "Task" [ 1759.494253] env[68906]: } to complete. 
{{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1759.501452] env[68906]: DEBUG oslo_vmware.api [None req-4af5cd17-ead4-446a-9184-7b7ebdecc7be tempest-ServerTagsTestJSON-693567183 tempest-ServerTagsTestJSON-693567183-project-member] Task: {'id': session[52a3cb0d-1212-490c-2669-91043b4da4d8]52d10c56-d5c2-6093-4180-d9248493bb77, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1759.591365] env[68906]: DEBUG nova.network.neutron [req-1b3d6ba7-2555-452b-97e6-4ba71aa6d4fe req-ef71f117-ceac-4600-a9ff-1ac745bb0aa8 service nova] [instance: 7994d291-b4bf-48f5-ad34-c1f484d77f6e] Updated VIF entry in instance network info cache for port 7590ab0e-7e73-49c1-ac51-135d3da9fa74. {{(pid=68906) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1759.591706] env[68906]: DEBUG nova.network.neutron [req-1b3d6ba7-2555-452b-97e6-4ba71aa6d4fe req-ef71f117-ceac-4600-a9ff-1ac745bb0aa8 service nova] [instance: 7994d291-b4bf-48f5-ad34-c1f484d77f6e] Updating instance_info_cache with network_info: [{"id": "7590ab0e-7e73-49c1-ac51-135d3da9fa74", "address": "fa:16:3e:6b:60:1c", "network": {"id": "a09558df-600d-4d90-bd9a-7e731b33ecc6", "bridge": "br-int", "label": "tempest-ServerTagsTestJSON-1811894926-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "e41caca4bbc74bdb9f3c35b4b6e2c5ba", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "aa410d21-2141-45bb-8d0b-16c77304605f", "external-id": "nsx-vlan-transportzone-886", "segmentation_id": 886, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap7590ab0e-7e", "ovs_interfaceid": "7590ab0e-7e73-49c1-ac51-135d3da9fa74", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68906) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1759.601392] env[68906]: DEBUG oslo_concurrency.lockutils [req-1b3d6ba7-2555-452b-97e6-4ba71aa6d4fe req-ef71f117-ceac-4600-a9ff-1ac745bb0aa8 service nova] Releasing lock "refresh_cache-7994d291-b4bf-48f5-ad34-c1f484d77f6e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1760.004415] env[68906]: DEBUG oslo_concurrency.lockutils [None req-4af5cd17-ead4-446a-9184-7b7ebdecc7be tempest-ServerTagsTestJSON-693567183 tempest-ServerTagsTestJSON-693567183-project-member] Releasing lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1760.004715] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-4af5cd17-ead4-446a-9184-7b7ebdecc7be tempest-ServerTagsTestJSON-693567183 tempest-ServerTagsTestJSON-693567183-project-member] [instance: 7994d291-b4bf-48f5-ad34-c1f484d77f6e] Processing image b1400c31-d33b-4e13-944f-4c645e62493e {{(pid=68906) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 
1760.004877] env[68906]: DEBUG oslo_concurrency.lockutils [None req-4af5cd17-ead4-446a-9184-7b7ebdecc7be tempest-ServerTagsTestJSON-693567183 tempest-ServerTagsTestJSON-693567183-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e/b1400c31-d33b-4e13-944f-4c645e62493e.vmdk" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1773.578719] env[68906]: DEBUG oslo_concurrency.lockutils [None req-36d3bf0a-44ab-4bd7-93e3-cdff2857f986 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Acquiring lock "ce6e5cd6-efb8-46d1-811d-74c084661cce" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1773.915559] env[68906]: DEBUG oslo_concurrency.lockutils [None req-f44832be-2f4a-4582-8745-029b5c53cab6 tempest-ServerActionsTestOtherA-1507860010 tempest-ServerActionsTestOtherA-1507860010-project-member] Acquiring lock "01b79dfa-cd20-495d-b112-8429c28b741e" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1773.915559] env[68906]: DEBUG oslo_concurrency.lockutils [None req-f44832be-2f4a-4582-8745-029b5c53cab6 tempest-ServerActionsTestOtherA-1507860010 tempest-ServerActionsTestOtherA-1507860010-project-member] Lock "01b79dfa-cd20-495d-b112-8429c28b741e" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1785.142994] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._cleanup_incomplete_migrations {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1785.143276] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Cleaning up deleted instances with incomplete migration {{(pid=68906) _cleanup_incomplete_migrations /opt/stack/nova/nova/compute/manager.py:11236}} [ 1786.150208] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1788.136817] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1788.162452] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1788.162667] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Starting heal instance info cache {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 1788.162823] 
env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Rebuilding the list of instances to heal {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 1788.184307] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: 7466df8a-59a9-49b9-bff7-c4efbeae3eee] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1788.184485] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: 89171680-c76d-4826-9236-379542661ffb] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1788.184592] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: 9b884416-df89-4d8c-b2ab-0667db52a718] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1788.184717] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: aed06616-d008-4695-b66e-9f40acf5ebd3] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1788.184843] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: 17327bc3-433e-4006-93c7-e53714ed70c2] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1788.184968] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: 32f5b54d-30bf-4fe9-9622-3ff74344b3f3] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1788.185107] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: 922d81ba-c8d2-43ba-b1c5-f2943418d6a2] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1788.185232] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: 736db39c-e5e5-4a54-b85a-aa5c703f432e] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1788.185351] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: ce6e5cd6-efb8-46d1-811d-74c084661cce] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1788.185469] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: 7994d291-b4bf-48f5-ad34-c1f484d77f6e] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1788.185591] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Didn't find any instances for network info cache update. 
{{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 1790.140607] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1790.141015] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1792.140547] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1793.140769] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1793.141125] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=68906) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 1793.141237] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1795.149080] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1798.136130] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1799.141707] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager.update_available_resource {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1799.153843] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1799.154096] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1799.154269] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Lock 
"compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1799.154424] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=68906) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1799.155560] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-35a8f5f1-0a55-461f-9458-07d6505e8cfa {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1799.165084] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fa8a8de5-39e4-4125-a4a3-e02c539d6edd {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1799.179207] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bb6f1489-347c-40b0-ab4a-cfb8dafc4a83 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1799.185424] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0b8e47d6-b045-4199-b854-0e92bdadfef4 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1799.215684] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180945MB free_disk=93GB free_vcpus=48 pci_devices=None {{(pid=68906) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1799.215848] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1799.216063] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1799.358278] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 7466df8a-59a9-49b9-bff7-c4efbeae3eee actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1799.358454] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 89171680-c76d-4826-9236-379542661ffb actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1799.358588] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 9b884416-df89-4d8c-b2ab-0667db52a718 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1799.358712] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance aed06616-d008-4695-b66e-9f40acf5ebd3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1799.358833] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 17327bc3-433e-4006-93c7-e53714ed70c2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1799.358964] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 32f5b54d-30bf-4fe9-9622-3ff74344b3f3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1799.359131] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 922d81ba-c8d2-43ba-b1c5-f2943418d6a2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1799.359255] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 736db39c-e5e5-4a54-b85a-aa5c703f432e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1799.359373] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance ce6e5cd6-efb8-46d1-811d-74c084661cce actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1799.359488] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 7994d291-b4bf-48f5-ad34-c1f484d77f6e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1799.371554] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 860248ea-e77b-4ff6-af64-b75f88a31348 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1799.382138] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 3cfde5a7-3148-426c-8867-ffafb33dc95b has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1799.392170] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 709defd2-4089-410e-b317-c41c97e01f62 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1799.401409] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 01b79dfa-cd20-495d-b112-8429c28b741e has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1799.401633] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=68906) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1799.401781] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=68906) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1799.417292] env[68906]: DEBUG nova.scheduler.client.report [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Refreshing inventories for resource provider 1119f6db-bfd7-4ef3-bdff-5c6974dc249b {{(pid=68906) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:804}} [ 1799.430477] env[68906]: DEBUG nova.scheduler.client.report [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Updating ProviderTree inventory for provider 1119f6db-bfd7-4ef3-bdff-5c6974dc249b from _refresh_and_get_inventory using data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 93, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68906) _refresh_and_get_inventory /opt/stack/nova/nova/scheduler/client/report.py:768}} [ 1799.430655] env[68906]: DEBUG nova.compute.provider_tree [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Updating inventory in ProviderTree for provider 1119f6db-bfd7-4ef3-bdff-5c6974dc249b with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 93, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68906) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 1799.440950] env[68906]: DEBUG nova.scheduler.client.report [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Refreshing aggregate associations for resource provider 1119f6db-bfd7-4ef3-bdff-5c6974dc249b, aggregates: None {{(pid=68906) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:813}} [ 1799.457056] env[68906]: DEBUG nova.scheduler.client.report [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Refreshing trait associations for resource provider 1119f6db-bfd7-4ef3-bdff-5c6974dc249b, traits: COMPUTE_SAME_HOST_COLD_MIGRATE,COMPUTE_IMAGE_TYPE_VMDK,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NODE,COMPUTE_IMAGE_TYPE_ISO {{(pid=68906) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:825}} [ 1799.600688] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0e69d19b-dea1-4c68-9b35-ad1eacc5fba4 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1799.608374] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-5420a650-90ae-449b-ba49-7f806dba07c6 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1799.639763] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-09937456-7ee7-4538-9245-317ec74487c9 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1799.646774] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b19e2c08-97cd-4407-b8dc-750a610aca7a {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1799.660220] env[68906]: DEBUG nova.compute.provider_tree [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Inventory has not changed in ProviderTree for provider: 1119f6db-bfd7-4ef3-bdff-5c6974dc249b {{(pid=68906) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1799.668168] env[68906]: DEBUG nova.scheduler.client.report [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Inventory has not changed for provider 1119f6db-bfd7-4ef3-bdff-5c6974dc249b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 93, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68906) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1799.683975] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=68906) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1799.684186] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.468s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1800.141077] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._run_pending_deletes {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1800.141262] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Cleaning up deleted instances {{(pid=68906) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11198}} [ 1800.151037] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] There are 0 instances to clean {{(pid=68906) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11207}} [ 1806.037406] env[68906]: WARNING oslo_vmware.rw_handles [None req-5dfdbe4f-bf7e-46a4-9601-537f14957a5d tempest-ServerGroupTestJSON-848006695 tempest-ServerGroupTestJSON-848006695-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1806.037406] env[68906]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1806.037406] env[68906]: ERROR oslo_vmware.rw_handles File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1806.037406] env[68906]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1806.037406] env[68906]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1806.037406] env[68906]: ERROR oslo_vmware.rw_handles response.begin() [ 1806.037406] env[68906]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1806.037406] env[68906]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1806.037406] env[68906]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1806.037406] env[68906]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1806.037406] env[68906]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1806.037406] env[68906]: ERROR oslo_vmware.rw_handles [ 1806.038140] env[68906]: DEBUG nova.virt.vmwareapi.images [None req-5dfdbe4f-bf7e-46a4-9601-537f14957a5d tempest-ServerGroupTestJSON-848006695 tempest-ServerGroupTestJSON-848006695-project-member] [instance: 7466df8a-59a9-49b9-bff7-c4efbeae3eee] Downloaded image file data b1400c31-d33b-4e13-944f-4c645e62493e to vmware_temp/5b64a95f-47e8-4c41-bd9f-b3db1571dc42/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk on the data store datastore2 {{(pid=68906) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1806.039708] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-5dfdbe4f-bf7e-46a4-9601-537f14957a5d tempest-ServerGroupTestJSON-848006695 tempest-ServerGroupTestJSON-848006695-project-member] [instance: 7466df8a-59a9-49b9-bff7-c4efbeae3eee] Caching image {{(pid=68906) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1806.039957] env[68906]: DEBUG nova.virt.vmwareapi.vm_util [None req-5dfdbe4f-bf7e-46a4-9601-537f14957a5d tempest-ServerGroupTestJSON-848006695 tempest-ServerGroupTestJSON-848006695-project-member] Copying Virtual Disk [datastore2] vmware_temp/5b64a95f-47e8-4c41-bd9f-b3db1571dc42/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk to [datastore2] vmware_temp/5b64a95f-47e8-4c41-bd9f-b3db1571dc42/b1400c31-d33b-4e13-944f-4c645e62493e/b1400c31-d33b-4e13-944f-4c645e62493e.vmdk {{(pid=68906) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1806.040341] env[68906]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-24b61589-673b-4e5e-a739-6030cb24b8ab {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1806.048056] env[68906]: DEBUG oslo_vmware.api [None req-5dfdbe4f-bf7e-46a4-9601-537f14957a5d tempest-ServerGroupTestJSON-848006695 tempest-ServerGroupTestJSON-848006695-project-member] Waiting for the task: (returnval){ [ 1806.048056] env[68906]: value = "task-3475432" [ 1806.048056] env[68906]: _type = "Task" [ 1806.048056] env[68906]: } to complete. {{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1806.056068] env[68906]: DEBUG oslo_vmware.api [None req-5dfdbe4f-bf7e-46a4-9601-537f14957a5d tempest-ServerGroupTestJSON-848006695 tempest-ServerGroupTestJSON-848006695-project-member] Task: {'id': task-3475432, 'name': CopyVirtualDisk_Task} progress is 0%. 
{{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1806.562029] env[68906]: DEBUG oslo_vmware.exceptions [None req-5dfdbe4f-bf7e-46a4-9601-537f14957a5d tempest-ServerGroupTestJSON-848006695 tempest-ServerGroupTestJSON-848006695-project-member] Fault InvalidArgument not matched. {{(pid=68906) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1806.562029] env[68906]: DEBUG oslo_concurrency.lockutils [None req-5dfdbe4f-bf7e-46a4-9601-537f14957a5d tempest-ServerGroupTestJSON-848006695 tempest-ServerGroupTestJSON-848006695-project-member] Releasing lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e/b1400c31-d33b-4e13-944f-4c645e62493e.vmdk" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1806.562766] env[68906]: ERROR nova.compute.manager [None req-5dfdbe4f-bf7e-46a4-9601-537f14957a5d tempest-ServerGroupTestJSON-848006695 tempest-ServerGroupTestJSON-848006695-project-member] [instance: 7466df8a-59a9-49b9-bff7-c4efbeae3eee] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1806.562766] env[68906]: Faults: ['InvalidArgument'] [ 1806.562766] env[68906]: ERROR nova.compute.manager [instance: 7466df8a-59a9-49b9-bff7-c4efbeae3eee] Traceback (most recent call last): [ 1806.562766] env[68906]: ERROR nova.compute.manager [instance: 7466df8a-59a9-49b9-bff7-c4efbeae3eee] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1806.562766] env[68906]: ERROR nova.compute.manager [instance: 7466df8a-59a9-49b9-bff7-c4efbeae3eee] yield resources [ 1806.562766] env[68906]: ERROR nova.compute.manager [instance: 7466df8a-59a9-49b9-bff7-c4efbeae3eee] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1806.562766] env[68906]: ERROR nova.compute.manager [instance: 7466df8a-59a9-49b9-bff7-c4efbeae3eee] self.driver.spawn(context, instance, image_meta, [ 1806.562766] env[68906]: ERROR nova.compute.manager [instance: 7466df8a-59a9-49b9-bff7-c4efbeae3eee] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1806.562766] env[68906]: ERROR nova.compute.manager [instance: 7466df8a-59a9-49b9-bff7-c4efbeae3eee] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1806.562766] env[68906]: ERROR nova.compute.manager [instance: 7466df8a-59a9-49b9-bff7-c4efbeae3eee] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1806.562766] env[68906]: ERROR nova.compute.manager [instance: 7466df8a-59a9-49b9-bff7-c4efbeae3eee] self._fetch_image_if_missing(context, vi) [ 1806.562766] env[68906]: ERROR nova.compute.manager [instance: 7466df8a-59a9-49b9-bff7-c4efbeae3eee] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1806.563448] env[68906]: ERROR nova.compute.manager [instance: 7466df8a-59a9-49b9-bff7-c4efbeae3eee] image_cache(vi, tmp_image_ds_loc) [ 1806.563448] env[68906]: ERROR nova.compute.manager [instance: 7466df8a-59a9-49b9-bff7-c4efbeae3eee] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1806.563448] env[68906]: ERROR nova.compute.manager [instance: 7466df8a-59a9-49b9-bff7-c4efbeae3eee] vm_util.copy_virtual_disk( [ 1806.563448] env[68906]: ERROR nova.compute.manager [instance: 7466df8a-59a9-49b9-bff7-c4efbeae3eee] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 
1423, in copy_virtual_disk [ 1806.563448] env[68906]: ERROR nova.compute.manager [instance: 7466df8a-59a9-49b9-bff7-c4efbeae3eee] session._wait_for_task(vmdk_copy_task) [ 1806.563448] env[68906]: ERROR nova.compute.manager [instance: 7466df8a-59a9-49b9-bff7-c4efbeae3eee] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1806.563448] env[68906]: ERROR nova.compute.manager [instance: 7466df8a-59a9-49b9-bff7-c4efbeae3eee] return self.wait_for_task(task_ref) [ 1806.563448] env[68906]: ERROR nova.compute.manager [instance: 7466df8a-59a9-49b9-bff7-c4efbeae3eee] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1806.563448] env[68906]: ERROR nova.compute.manager [instance: 7466df8a-59a9-49b9-bff7-c4efbeae3eee] return evt.wait() [ 1806.563448] env[68906]: ERROR nova.compute.manager [instance: 7466df8a-59a9-49b9-bff7-c4efbeae3eee] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1806.563448] env[68906]: ERROR nova.compute.manager [instance: 7466df8a-59a9-49b9-bff7-c4efbeae3eee] result = hub.switch() [ 1806.563448] env[68906]: ERROR nova.compute.manager [instance: 7466df8a-59a9-49b9-bff7-c4efbeae3eee] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1806.563448] env[68906]: ERROR nova.compute.manager [instance: 7466df8a-59a9-49b9-bff7-c4efbeae3eee] return self.greenlet.switch() [ 1806.564072] env[68906]: ERROR nova.compute.manager [instance: 7466df8a-59a9-49b9-bff7-c4efbeae3eee] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1806.564072] env[68906]: ERROR nova.compute.manager [instance: 7466df8a-59a9-49b9-bff7-c4efbeae3eee] self.f(*self.args, **self.kw) [ 1806.564072] env[68906]: ERROR nova.compute.manager [instance: 7466df8a-59a9-49b9-bff7-c4efbeae3eee] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1806.564072] env[68906]: ERROR nova.compute.manager [instance: 7466df8a-59a9-49b9-bff7-c4efbeae3eee] raise exceptions.translate_fault(task_info.error) [ 1806.564072] env[68906]: ERROR nova.compute.manager [instance: 7466df8a-59a9-49b9-bff7-c4efbeae3eee] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1806.564072] env[68906]: ERROR nova.compute.manager [instance: 7466df8a-59a9-49b9-bff7-c4efbeae3eee] Faults: ['InvalidArgument'] [ 1806.564072] env[68906]: ERROR nova.compute.manager [instance: 7466df8a-59a9-49b9-bff7-c4efbeae3eee] [ 1806.564072] env[68906]: INFO nova.compute.manager [None req-5dfdbe4f-bf7e-46a4-9601-537f14957a5d tempest-ServerGroupTestJSON-848006695 tempest-ServerGroupTestJSON-848006695-project-member] [instance: 7466df8a-59a9-49b9-bff7-c4efbeae3eee] Terminating instance [ 1806.565375] env[68906]: DEBUG oslo_concurrency.lockutils [None req-405761bb-a2e6-43b7-ab88-ccfe847f6af7 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Acquired lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e/b1400c31-d33b-4e13-944f-4c645e62493e.vmdk" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1806.565621] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-405761bb-a2e6-43b7-ab88-ccfe847f6af7 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Creating directory with 
path [datastore2] devstack-image-cache_base {{(pid=68906) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1806.566012] env[68906]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-10be52f7-ad37-4f2f-96f6-89e6a86ed09e {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1806.568177] env[68906]: DEBUG nova.compute.manager [None req-5dfdbe4f-bf7e-46a4-9601-537f14957a5d tempest-ServerGroupTestJSON-848006695 tempest-ServerGroupTestJSON-848006695-project-member] [instance: 7466df8a-59a9-49b9-bff7-c4efbeae3eee] Start destroying the instance on the hypervisor. {{(pid=68906) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1806.568409] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-5dfdbe4f-bf7e-46a4-9601-537f14957a5d tempest-ServerGroupTestJSON-848006695 tempest-ServerGroupTestJSON-848006695-project-member] [instance: 7466df8a-59a9-49b9-bff7-c4efbeae3eee] Destroying instance {{(pid=68906) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1806.569194] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-04e43701-6f55-4864-9b17-8c9b60e51256 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1806.576274] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-5dfdbe4f-bf7e-46a4-9601-537f14957a5d tempest-ServerGroupTestJSON-848006695 tempest-ServerGroupTestJSON-848006695-project-member] [instance: 7466df8a-59a9-49b9-bff7-c4efbeae3eee] Unregistering the VM {{(pid=68906) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1806.576537] env[68906]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-db0bc8a7-5d99-4882-ab42-4b54a62a773a {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1806.578789] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-405761bb-a2e6-43b7-ab88-ccfe847f6af7 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=68906) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1806.578998] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-405761bb-a2e6-43b7-ab88-ccfe847f6af7 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=68906) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1806.579999] env[68906]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-257a1816-875b-4b72-b310-e8ce06986fa7 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1806.584707] env[68906]: DEBUG oslo_vmware.api [None req-405761bb-a2e6-43b7-ab88-ccfe847f6af7 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Waiting for the task: (returnval){ [ 1806.584707] env[68906]: value = "session[52a3cb0d-1212-490c-2669-91043b4da4d8]526d5494-5a0f-d781-c94b-aa430005f995" [ 1806.584707] env[68906]: _type = "Task" [ 1806.584707] env[68906]: } to complete. 
{{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1806.591504] env[68906]: DEBUG oslo_vmware.api [None req-405761bb-a2e6-43b7-ab88-ccfe847f6af7 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Task: {'id': session[52a3cb0d-1212-490c-2669-91043b4da4d8]526d5494-5a0f-d781-c94b-aa430005f995, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1806.648172] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-5dfdbe4f-bf7e-46a4-9601-537f14957a5d tempest-ServerGroupTestJSON-848006695 tempest-ServerGroupTestJSON-848006695-project-member] [instance: 7466df8a-59a9-49b9-bff7-c4efbeae3eee] Unregistered the VM {{(pid=68906) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1806.648461] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-5dfdbe4f-bf7e-46a4-9601-537f14957a5d tempest-ServerGroupTestJSON-848006695 tempest-ServerGroupTestJSON-848006695-project-member] [instance: 7466df8a-59a9-49b9-bff7-c4efbeae3eee] Deleting contents of the VM from datastore datastore2 {{(pid=68906) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1806.648726] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-5dfdbe4f-bf7e-46a4-9601-537f14957a5d tempest-ServerGroupTestJSON-848006695 tempest-ServerGroupTestJSON-848006695-project-member] Deleting the datastore file [datastore2] 7466df8a-59a9-49b9-bff7-c4efbeae3eee {{(pid=68906) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1806.649039] env[68906]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-c32f81f1-a1f5-4959-9568-a563f7325666 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1806.655234] env[68906]: DEBUG oslo_vmware.api [None req-5dfdbe4f-bf7e-46a4-9601-537f14957a5d tempest-ServerGroupTestJSON-848006695 tempest-ServerGroupTestJSON-848006695-project-member] Waiting for the task: (returnval){ [ 1806.655234] env[68906]: value = "task-3475434" [ 1806.655234] env[68906]: _type = "Task" [ 1806.655234] env[68906]: } to complete. {{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1806.662568] env[68906]: DEBUG oslo_vmware.api [None req-5dfdbe4f-bf7e-46a4-9601-537f14957a5d tempest-ServerGroupTestJSON-848006695 tempest-ServerGroupTestJSON-848006695-project-member] Task: {'id': task-3475434, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1807.095502] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-405761bb-a2e6-43b7-ab88-ccfe847f6af7 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] [instance: 89171680-c76d-4826-9236-379542661ffb] Preparing fetch location {{(pid=68906) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1807.095787] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-405761bb-a2e6-43b7-ab88-ccfe847f6af7 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Creating directory with path [datastore2] vmware_temp/8ccf706c-92fb-44db-a1d2-5c07abb76020/b1400c31-d33b-4e13-944f-4c645e62493e {{(pid=68906) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1807.096010] env[68906]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-187164b5-ccd5-4cda-a49d-7bdb6f59e253 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1807.108586] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-405761bb-a2e6-43b7-ab88-ccfe847f6af7 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Created directory with path [datastore2] vmware_temp/8ccf706c-92fb-44db-a1d2-5c07abb76020/b1400c31-d33b-4e13-944f-4c645e62493e {{(pid=68906) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1807.108787] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-405761bb-a2e6-43b7-ab88-ccfe847f6af7 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] [instance: 89171680-c76d-4826-9236-379542661ffb] Fetch image to [datastore2] vmware_temp/8ccf706c-92fb-44db-a1d2-5c07abb76020/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk {{(pid=68906) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1807.109039] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-405761bb-a2e6-43b7-ab88-ccfe847f6af7 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] [instance: 89171680-c76d-4826-9236-379542661ffb] Downloading image file data b1400c31-d33b-4e13-944f-4c645e62493e to [datastore2] vmware_temp/8ccf706c-92fb-44db-a1d2-5c07abb76020/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk on the data store datastore2 {{(pid=68906) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1807.109715] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e10fa439-222f-4810-878e-c77dbea5709b {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1807.116292] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b0d26f5a-4425-4b6c-ba4a-916ce6ac8b2d {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1807.125018] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2cc7a3b6-b11b-4f0b-b449-424c6fc8501d {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1807.155491] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-ee2a94d3-4c6b-4fb1-ae88-dfdd239a5dd8 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1807.165667] env[68906]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-1e2bc85d-400c-4985-9a2e-80ec3ca94bf2 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1807.167250] env[68906]: DEBUG oslo_vmware.api [None req-5dfdbe4f-bf7e-46a4-9601-537f14957a5d tempest-ServerGroupTestJSON-848006695 tempest-ServerGroupTestJSON-848006695-project-member] Task: {'id': task-3475434, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.079472} completed successfully. {{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1807.167483] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-5dfdbe4f-bf7e-46a4-9601-537f14957a5d tempest-ServerGroupTestJSON-848006695 tempest-ServerGroupTestJSON-848006695-project-member] Deleted the datastore file {{(pid=68906) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1807.167721] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-5dfdbe4f-bf7e-46a4-9601-537f14957a5d tempest-ServerGroupTestJSON-848006695 tempest-ServerGroupTestJSON-848006695-project-member] [instance: 7466df8a-59a9-49b9-bff7-c4efbeae3eee] Deleted contents of the VM from datastore datastore2 {{(pid=68906) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1807.167899] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-5dfdbe4f-bf7e-46a4-9601-537f14957a5d tempest-ServerGroupTestJSON-848006695 tempest-ServerGroupTestJSON-848006695-project-member] [instance: 7466df8a-59a9-49b9-bff7-c4efbeae3eee] Instance destroyed {{(pid=68906) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1807.168085] env[68906]: INFO nova.compute.manager [None req-5dfdbe4f-bf7e-46a4-9601-537f14957a5d tempest-ServerGroupTestJSON-848006695 tempest-ServerGroupTestJSON-848006695-project-member] [instance: 7466df8a-59a9-49b9-bff7-c4efbeae3eee] Took 0.60 seconds to destroy the instance on the hypervisor. 
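The failure path above is worth pausing on: the CopyVirtualDisk_Task fails server-side, _poll_task reads the task error, and oslo.vmware translates the VI fault name into an exception class; since 'InvalidArgument' has no dedicated class ("Fault InvalidArgument not matched" earlier in this section), a generic VimFaultException carrying the fault list is raised. Below is a minimal sketch of that translation pattern only; the class and function names mirror the log but are illustrative, not the actual oslo.vmware internals.

# Sketch of the fault-name -> exception translation visible in the traceback above.
# FAULT_CLASSES and translate_fault are illustrative stand-ins, not oslo.vmware code.

class VimFaultException(Exception):
    """Generic fallback carrying the raw fault names, as in the traceback above."""
    def __init__(self, fault_list, message):
        super().__init__(message)
        self.fault_list = fault_list

class FileNotFoundFault(VimFaultException):
    pass

# Known fault names map to dedicated classes; anything unmatched
# (e.g. 'InvalidArgument' here) falls through to the generic class.
FAULT_CLASSES = {
    'FileNotFound': FileNotFoundFault,
}

def translate_fault(fault_names, message):
    for name in fault_names:
        cls = FAULT_CLASSES.get(name)
        if cls is not None:
            return cls(fault_names, message)
    # No match: the equivalent of "Fault InvalidArgument not matched."
    return VimFaultException(fault_names, message)

# Re-raising the task error the way the poll loop in the traceback does:
exc = translate_fault(['InvalidArgument'],
                      'A specified parameter was not correct: fileType')
assert isinstance(exc, VimFaultException)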
[ 1807.170696] env[68906]: DEBUG nova.compute.claims [None req-5dfdbe4f-bf7e-46a4-9601-537f14957a5d tempest-ServerGroupTestJSON-848006695 tempest-ServerGroupTestJSON-848006695-project-member] [instance: 7466df8a-59a9-49b9-bff7-c4efbeae3eee] Aborting claim: {{(pid=68906) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1807.170859] env[68906]: DEBUG oslo_concurrency.lockutils [None req-5dfdbe4f-bf7e-46a4-9601-537f14957a5d tempest-ServerGroupTestJSON-848006695 tempest-ServerGroupTestJSON-848006695-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1807.171087] env[68906]: DEBUG oslo_concurrency.lockutils [None req-5dfdbe4f-bf7e-46a4-9601-537f14957a5d tempest-ServerGroupTestJSON-848006695 tempest-ServerGroupTestJSON-848006695-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1807.187911] env[68906]: DEBUG nova.virt.vmwareapi.images [None req-405761bb-a2e6-43b7-ab88-ccfe847f6af7 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] [instance: 89171680-c76d-4826-9236-379542661ffb] Downloading image file data b1400c31-d33b-4e13-944f-4c645e62493e to the data store datastore2 {{(pid=68906) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1807.237819] env[68906]: DEBUG oslo_vmware.rw_handles [None req-405761bb-a2e6-43b7-ab88-ccfe847f6af7 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/8ccf706c-92fb-44db-a1d2-5c07abb76020/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=68906) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1807.296697] env[68906]: DEBUG oslo_vmware.rw_handles [None req-405761bb-a2e6-43b7-ab88-ccfe847f6af7 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Completed reading data from the image iterator. {{(pid=68906) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1807.296877] env[68906]: DEBUG oslo_vmware.rw_handles [None req-405761bb-a2e6-43b7-ab88-ccfe847f6af7 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Closing write handle for https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/8ccf706c-92fb-44db-a1d2-5c07abb76020/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=68906) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1807.432891] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3d407f01-c0ec-48d5-8f06-bc03a8abc910 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1807.440628] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d54b36ff-5e5e-4f80-bfe3-881a1b2ae0bd {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1807.469537] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-16661693-30ed-4895-931b-cd74c5ca4ecc {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1807.476598] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9211c48e-ac88-4272-8d61-fbd2c38e3dea {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1807.490428] env[68906]: DEBUG nova.compute.provider_tree [None req-5dfdbe4f-bf7e-46a4-9601-537f14957a5d tempest-ServerGroupTestJSON-848006695 tempest-ServerGroupTestJSON-848006695-project-member] Inventory has not changed in ProviderTree for provider: 1119f6db-bfd7-4ef3-bdff-5c6974dc249b {{(pid=68906) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1807.498370] env[68906]: DEBUG nova.scheduler.client.report [None req-5dfdbe4f-bf7e-46a4-9601-537f14957a5d tempest-ServerGroupTestJSON-848006695 tempest-ServerGroupTestJSON-848006695-project-member] Inventory has not changed for provider 1119f6db-bfd7-4ef3-bdff-5c6974dc249b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 93, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68906) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1807.511295] env[68906]: DEBUG oslo_concurrency.lockutils [None req-5dfdbe4f-bf7e-46a4-9601-537f14957a5d tempest-ServerGroupTestJSON-848006695 tempest-ServerGroupTestJSON-848006695-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.340s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1807.511824] env[68906]: ERROR nova.compute.manager [None req-5dfdbe4f-bf7e-46a4-9601-537f14957a5d tempest-ServerGroupTestJSON-848006695 tempest-ServerGroupTestJSON-848006695-project-member] [instance: 7466df8a-59a9-49b9-bff7-c4efbeae3eee] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1807.511824] env[68906]: Faults: ['InvalidArgument'] [ 1807.511824] env[68906]: ERROR nova.compute.manager [instance: 7466df8a-59a9-49b9-bff7-c4efbeae3eee] Traceback (most recent call last): [ 1807.511824] env[68906]: ERROR nova.compute.manager [instance: 7466df8a-59a9-49b9-bff7-c4efbeae3eee] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1807.511824] env[68906]: ERROR nova.compute.manager 
[instance: 7466df8a-59a9-49b9-bff7-c4efbeae3eee] self.driver.spawn(context, instance, image_meta, [ 1807.511824] env[68906]: ERROR nova.compute.manager [instance: 7466df8a-59a9-49b9-bff7-c4efbeae3eee] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1807.511824] env[68906]: ERROR nova.compute.manager [instance: 7466df8a-59a9-49b9-bff7-c4efbeae3eee] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1807.511824] env[68906]: ERROR nova.compute.manager [instance: 7466df8a-59a9-49b9-bff7-c4efbeae3eee] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1807.511824] env[68906]: ERROR nova.compute.manager [instance: 7466df8a-59a9-49b9-bff7-c4efbeae3eee] self._fetch_image_if_missing(context, vi) [ 1807.511824] env[68906]: ERROR nova.compute.manager [instance: 7466df8a-59a9-49b9-bff7-c4efbeae3eee] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1807.511824] env[68906]: ERROR nova.compute.manager [instance: 7466df8a-59a9-49b9-bff7-c4efbeae3eee] image_cache(vi, tmp_image_ds_loc) [ 1807.511824] env[68906]: ERROR nova.compute.manager [instance: 7466df8a-59a9-49b9-bff7-c4efbeae3eee] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1807.512294] env[68906]: ERROR nova.compute.manager [instance: 7466df8a-59a9-49b9-bff7-c4efbeae3eee] vm_util.copy_virtual_disk( [ 1807.512294] env[68906]: ERROR nova.compute.manager [instance: 7466df8a-59a9-49b9-bff7-c4efbeae3eee] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1807.512294] env[68906]: ERROR nova.compute.manager [instance: 7466df8a-59a9-49b9-bff7-c4efbeae3eee] session._wait_for_task(vmdk_copy_task) [ 1807.512294] env[68906]: ERROR nova.compute.manager [instance: 7466df8a-59a9-49b9-bff7-c4efbeae3eee] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1807.512294] env[68906]: ERROR nova.compute.manager [instance: 7466df8a-59a9-49b9-bff7-c4efbeae3eee] return self.wait_for_task(task_ref) [ 1807.512294] env[68906]: ERROR nova.compute.manager [instance: 7466df8a-59a9-49b9-bff7-c4efbeae3eee] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1807.512294] env[68906]: ERROR nova.compute.manager [instance: 7466df8a-59a9-49b9-bff7-c4efbeae3eee] return evt.wait() [ 1807.512294] env[68906]: ERROR nova.compute.manager [instance: 7466df8a-59a9-49b9-bff7-c4efbeae3eee] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1807.512294] env[68906]: ERROR nova.compute.manager [instance: 7466df8a-59a9-49b9-bff7-c4efbeae3eee] result = hub.switch() [ 1807.512294] env[68906]: ERROR nova.compute.manager [instance: 7466df8a-59a9-49b9-bff7-c4efbeae3eee] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1807.512294] env[68906]: ERROR nova.compute.manager [instance: 7466df8a-59a9-49b9-bff7-c4efbeae3eee] return self.greenlet.switch() [ 1807.512294] env[68906]: ERROR nova.compute.manager [instance: 7466df8a-59a9-49b9-bff7-c4efbeae3eee] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1807.512294] env[68906]: ERROR nova.compute.manager [instance: 7466df8a-59a9-49b9-bff7-c4efbeae3eee] self.f(*self.args, **self.kw) [ 1807.512730] env[68906]: ERROR nova.compute.manager [instance: 7466df8a-59a9-49b9-bff7-c4efbeae3eee] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1807.512730] env[68906]: ERROR nova.compute.manager [instance: 7466df8a-59a9-49b9-bff7-c4efbeae3eee] raise exceptions.translate_fault(task_info.error) [ 1807.512730] env[68906]: ERROR nova.compute.manager [instance: 7466df8a-59a9-49b9-bff7-c4efbeae3eee] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1807.512730] env[68906]: ERROR nova.compute.manager [instance: 7466df8a-59a9-49b9-bff7-c4efbeae3eee] Faults: ['InvalidArgument'] [ 1807.512730] env[68906]: ERROR nova.compute.manager [instance: 7466df8a-59a9-49b9-bff7-c4efbeae3eee] [ 1807.512730] env[68906]: DEBUG nova.compute.utils [None req-5dfdbe4f-bf7e-46a4-9601-537f14957a5d tempest-ServerGroupTestJSON-848006695 tempest-ServerGroupTestJSON-848006695-project-member] [instance: 7466df8a-59a9-49b9-bff7-c4efbeae3eee] VimFaultException {{(pid=68906) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1807.513884] env[68906]: DEBUG nova.compute.manager [None req-5dfdbe4f-bf7e-46a4-9601-537f14957a5d tempest-ServerGroupTestJSON-848006695 tempest-ServerGroupTestJSON-848006695-project-member] [instance: 7466df8a-59a9-49b9-bff7-c4efbeae3eee] Build of instance 7466df8a-59a9-49b9-bff7-c4efbeae3eee was re-scheduled: A specified parameter was not correct: fileType [ 1807.513884] env[68906]: Faults: ['InvalidArgument'] {{(pid=68906) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1807.514275] env[68906]: DEBUG nova.compute.manager [None req-5dfdbe4f-bf7e-46a4-9601-537f14957a5d tempest-ServerGroupTestJSON-848006695 tempest-ServerGroupTestJSON-848006695-project-member] [instance: 7466df8a-59a9-49b9-bff7-c4efbeae3eee] Unplugging VIFs for instance {{(pid=68906) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1807.514448] env[68906]: DEBUG nova.compute.manager [None req-5dfdbe4f-bf7e-46a4-9601-537f14957a5d tempest-ServerGroupTestJSON-848006695 tempest-ServerGroupTestJSON-848006695-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=68906) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1807.514617] env[68906]: DEBUG nova.compute.manager [None req-5dfdbe4f-bf7e-46a4-9601-537f14957a5d tempest-ServerGroupTestJSON-848006695 tempest-ServerGroupTestJSON-848006695-project-member] [instance: 7466df8a-59a9-49b9-bff7-c4efbeae3eee] Deallocating network for instance {{(pid=68906) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1807.514779] env[68906]: DEBUG nova.network.neutron [None req-5dfdbe4f-bf7e-46a4-9601-537f14957a5d tempest-ServerGroupTestJSON-848006695 tempest-ServerGroupTestJSON-848006695-project-member] [instance: 7466df8a-59a9-49b9-bff7-c4efbeae3eee] deallocate_for_instance() {{(pid=68906) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1807.808455] env[68906]: DEBUG nova.network.neutron [None req-5dfdbe4f-bf7e-46a4-9601-537f14957a5d tempest-ServerGroupTestJSON-848006695 tempest-ServerGroupTestJSON-848006695-project-member] [instance: 7466df8a-59a9-49b9-bff7-c4efbeae3eee] Updating instance_info_cache with network_info: [] {{(pid=68906) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1807.818668] env[68906]: INFO nova.compute.manager [None req-5dfdbe4f-bf7e-46a4-9601-537f14957a5d tempest-ServerGroupTestJSON-848006695 tempest-ServerGroupTestJSON-848006695-project-member] [instance: 7466df8a-59a9-49b9-bff7-c4efbeae3eee] Took 0.30 seconds to deallocate network for instance. [ 1807.905958] env[68906]: INFO nova.scheduler.client.report [None req-5dfdbe4f-bf7e-46a4-9601-537f14957a5d tempest-ServerGroupTestJSON-848006695 tempest-ServerGroupTestJSON-848006695-project-member] Deleted allocations for instance 7466df8a-59a9-49b9-bff7-c4efbeae3eee [ 1807.926270] env[68906]: DEBUG oslo_concurrency.lockutils [None req-5dfdbe4f-bf7e-46a4-9601-537f14957a5d tempest-ServerGroupTestJSON-848006695 tempest-ServerGroupTestJSON-848006695-project-member] Lock "7466df8a-59a9-49b9-bff7-c4efbeae3eee" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 673.312s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1807.927777] env[68906]: DEBUG oslo_concurrency.lockutils [None req-d161b1c7-2d71-466f-b59f-1fcc08f7ea0b tempest-ServerGroupTestJSON-848006695 tempest-ServerGroupTestJSON-848006695-project-member] Lock "7466df8a-59a9-49b9-bff7-c4efbeae3eee" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 477.069s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1807.928122] env[68906]: DEBUG oslo_concurrency.lockutils [None req-d161b1c7-2d71-466f-b59f-1fcc08f7ea0b tempest-ServerGroupTestJSON-848006695 tempest-ServerGroupTestJSON-848006695-project-member] Acquiring lock "7466df8a-59a9-49b9-bff7-c4efbeae3eee-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1807.928392] env[68906]: DEBUG oslo_concurrency.lockutils [None req-d161b1c7-2d71-466f-b59f-1fcc08f7ea0b tempest-ServerGroupTestJSON-848006695 tempest-ServerGroupTestJSON-848006695-project-member] Lock "7466df8a-59a9-49b9-bff7-c4efbeae3eee-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=68906) inner
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1807.928575] env[68906]: DEBUG oslo_concurrency.lockutils [None req-d161b1c7-2d71-466f-b59f-1fcc08f7ea0b tempest-ServerGroupTestJSON-848006695 tempest-ServerGroupTestJSON-848006695-project-member] Lock "7466df8a-59a9-49b9-bff7-c4efbeae3eee-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1807.930360] env[68906]: INFO nova.compute.manager [None req-d161b1c7-2d71-466f-b59f-1fcc08f7ea0b tempest-ServerGroupTestJSON-848006695 tempest-ServerGroupTestJSON-848006695-project-member] [instance: 7466df8a-59a9-49b9-bff7-c4efbeae3eee] Terminating instance [ 1807.932323] env[68906]: DEBUG nova.compute.manager [None req-d161b1c7-2d71-466f-b59f-1fcc08f7ea0b tempest-ServerGroupTestJSON-848006695 tempest-ServerGroupTestJSON-848006695-project-member] [instance: 7466df8a-59a9-49b9-bff7-c4efbeae3eee] Start destroying the instance on the hypervisor. {{(pid=68906) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1807.932635] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-d161b1c7-2d71-466f-b59f-1fcc08f7ea0b tempest-ServerGroupTestJSON-848006695 tempest-ServerGroupTestJSON-848006695-project-member] [instance: 7466df8a-59a9-49b9-bff7-c4efbeae3eee] Destroying instance {{(pid=68906) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1807.932775] env[68906]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-cb4da5f3-2524-4456-ae9f-1786d12e5dd3 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1807.937686] env[68906]: DEBUG nova.compute.manager [None req-08d15b06-bc1d-4202-8ca5-a913682ec4aa tempest-ServerDiskConfigTestJSON-1909384467 tempest-ServerDiskConfigTestJSON-1909384467-project-member] [instance: 860248ea-e77b-4ff6-af64-b75f88a31348] Starting instance... {{(pid=68906) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1807.947169] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-55a8d0dc-7cb0-4b64-ab20-f10b17cd4808 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1807.980777] env[68906]: WARNING nova.virt.vmwareapi.vmops [None req-d161b1c7-2d71-466f-b59f-1fcc08f7ea0b tempest-ServerGroupTestJSON-848006695 tempest-ServerGroupTestJSON-848006695-project-member] [instance: 7466df8a-59a9-49b9-bff7-c4efbeae3eee] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 7466df8a-59a9-49b9-bff7-c4efbeae3eee could not be found. [ 1807.980985] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-d161b1c7-2d71-466f-b59f-1fcc08f7ea0b tempest-ServerGroupTestJSON-848006695 tempest-ServerGroupTestJSON-848006695-project-member] [instance: 7466df8a-59a9-49b9-bff7-c4efbeae3eee] Instance destroyed {{(pid=68906) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1807.981183] env[68906]: INFO nova.compute.manager [None req-d161b1c7-2d71-466f-b59f-1fcc08f7ea0b tempest-ServerGroupTestJSON-848006695 tempest-ServerGroupTestJSON-848006695-project-member] [instance: 7466df8a-59a9-49b9-bff7-c4efbeae3eee] Took 0.05 seconds to destroy the instance on the hypervisor.
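The lockutils lines bracketing this sequence ('Acquiring lock ... by', 'acquired ... waited 477.069s', '"released" ... held 673.312s') all come from one decorator pattern: time how long the caller waits for the named lock, time how long it is held, and log both with the qualified name of the function inside the critical section. A rough sketch of that accounting follows; it reproduces the logging pattern only and is not oslo.concurrency's implementation.

# Illustrative sketch of the acquire/hold accounting behind the lockutils lines
# above; not the oslo_concurrency.lockutils code itself.
import threading
import time
from contextlib import contextmanager

_locks = {}
_registry_guard = threading.Lock()

@contextmanager
def timed_lock(name, caller):
    # Named locks are created on first use, shared by all callers of that name.
    with _registry_guard:
        lock = _locks.setdefault(name, threading.Lock())
    print(f'Acquiring lock "{name}" by "{caller}"')
    t0 = time.monotonic()
    lock.acquire()
    t1 = time.monotonic()
    print(f'Lock "{name}" acquired by "{caller}" :: waited {t1 - t0:.3f}s')
    try:
        yield
    finally:
        lock.release()
        print(f'Lock "{name}" "released" by "{caller}" :: held '
              f'{time.monotonic() - t1:.3f}s')

# Usage, mirroring the per-instance lock in this section:
with timed_lock("7466df8a-59a9-49b9-bff7-c4efbeae3eee",
                "do_terminate_instance"):
    pass  # critical section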
[ 1807.981435] env[68906]: DEBUG oslo.service.loopingcall [None req-d161b1c7-2d71-466f-b59f-1fcc08f7ea0b tempest-ServerGroupTestJSON-848006695 tempest-ServerGroupTestJSON-848006695-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=68906) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1807.985850] env[68906]: DEBUG nova.compute.manager [-] [instance: 7466df8a-59a9-49b9-bff7-c4efbeae3eee] Deallocating network for instance {{(pid=68906) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1807.985961] env[68906]: DEBUG nova.network.neutron [-] [instance: 7466df8a-59a9-49b9-bff7-c4efbeae3eee] deallocate_for_instance() {{(pid=68906) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1807.997751] env[68906]: DEBUG oslo_concurrency.lockutils [None req-08d15b06-bc1d-4202-8ca5-a913682ec4aa tempest-ServerDiskConfigTestJSON-1909384467 tempest-ServerDiskConfigTestJSON-1909384467-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1807.997982] env[68906]: DEBUG oslo_concurrency.lockutils [None req-08d15b06-bc1d-4202-8ca5-a913682ec4aa tempest-ServerDiskConfigTestJSON-1909384467 tempest-ServerDiskConfigTestJSON-1909384467-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1807.999372] env[68906]: INFO nova.compute.claims [None req-08d15b06-bc1d-4202-8ca5-a913682ec4aa tempest-ServerDiskConfigTestJSON-1909384467 tempest-ServerDiskConfigTestJSON-1909384467-project-member] [instance: 860248ea-e77b-4ff6-af64-b75f88a31348] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1808.017396] env[68906]: DEBUG nova.network.neutron [-] [instance: 7466df8a-59a9-49b9-bff7-c4efbeae3eee] Updating instance_info_cache with network_info: [] {{(pid=68906) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1808.034111] env[68906]: INFO nova.compute.manager [-] [instance: 7466df8a-59a9-49b9-bff7-c4efbeae3eee] Took 0.05 seconds to deallocate network for instance.
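The claim at 1807.999372 succeeds because the requested resources fit under the node's effective capacity, which Placement derives from the inventory dict repeated throughout this log as (total - reserved) * allocation_ratio per resource class. The check below is a simplified stand-in for the resource tracker's claim logic, using the exact inventory and usage numbers reported in this section.

# Effective-capacity arithmetic behind "Claim successful" above; a simplified
# stand-in for the resource tracker's logic, not Nova code.
INVENTORY = {
    'VCPU': {'total': 48, 'reserved': 0, 'allocation_ratio': 4.0},
    'MEMORY_MB': {'total': 196590, 'reserved': 512, 'allocation_ratio': 1.0},
    'DISK_GB': {'total': 400, 'reserved': 0, 'allocation_ratio': 1.0},
}

def capacity(inv):
    # Standard Placement formula: overcommit applies after reserved is removed.
    return (inv['total'] - inv['reserved']) * inv['allocation_ratio']

def can_claim(used, request):
    """True if every requested resource still fits under effective capacity."""
    return all(used.get(rc, 0) + amount <= capacity(INVENTORY[rc])
               for rc, amount in request.items())

# The m1.nano request from this log against the usage reported earlier
# ("total allocated vcpus: 10", used_ram=1792MB, used_disk=10GB):
used = {'VCPU': 10, 'MEMORY_MB': 1792, 'DISK_GB': 10}
request = {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}
print(can_claim(used, request))  # True: e.g. VCPU capacity is (48 - 0) * 4.0 = 192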
[ 1808.155069] env[68906]: DEBUG oslo_concurrency.lockutils [None req-d161b1c7-2d71-466f-b59f-1fcc08f7ea0b tempest-ServerGroupTestJSON-848006695 tempest-ServerGroupTestJSON-848006695-project-member] Lock "7466df8a-59a9-49b9-bff7-c4efbeae3eee" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.227s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1808.243012] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c35c1a73-ee69-4a6b-9948-1aa738c648a9 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1808.250631] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9d49691c-f0bc-4f66-89e8-725e9830d156 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1808.280031] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6ad761e6-c0b6-4943-b704-9e24cd0f918d {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1808.287046] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-121b03ea-4b88-46fb-8e60-6ca5a26636f2 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1808.299861] env[68906]: DEBUG nova.compute.provider_tree [None req-08d15b06-bc1d-4202-8ca5-a913682ec4aa tempest-ServerDiskConfigTestJSON-1909384467 tempest-ServerDiskConfigTestJSON-1909384467-project-member] Inventory has not changed in ProviderTree for provider: 1119f6db-bfd7-4ef3-bdff-5c6974dc249b {{(pid=68906) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1808.309689] env[68906]: DEBUG nova.scheduler.client.report [None req-08d15b06-bc1d-4202-8ca5-a913682ec4aa tempest-ServerDiskConfigTestJSON-1909384467 tempest-ServerDiskConfigTestJSON-1909384467-project-member] Inventory has not changed for provider 1119f6db-bfd7-4ef3-bdff-5c6974dc249b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 93, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68906) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1808.323085] env[68906]: DEBUG oslo_concurrency.lockutils [None req-08d15b06-bc1d-4202-8ca5-a913682ec4aa tempest-ServerDiskConfigTestJSON-1909384467 tempest-ServerDiskConfigTestJSON-1909384467-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.325s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1808.323539] env[68906]: DEBUG nova.compute.manager [None req-08d15b06-bc1d-4202-8ca5-a913682ec4aa tempest-ServerDiskConfigTestJSON-1909384467 tempest-ServerDiskConfigTestJSON-1909384467-project-member] [instance: 860248ea-e77b-4ff6-af64-b75f88a31348] Start building networks asynchronously for instance.
{{(pid=68906) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1808.358068] env[68906]: DEBUG nova.compute.utils [None req-08d15b06-bc1d-4202-8ca5-a913682ec4aa tempest-ServerDiskConfigTestJSON-1909384467 tempest-ServerDiskConfigTestJSON-1909384467-project-member] Using /dev/sd instead of None {{(pid=68906) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1808.358985] env[68906]: DEBUG nova.compute.manager [None req-08d15b06-bc1d-4202-8ca5-a913682ec4aa tempest-ServerDiskConfigTestJSON-1909384467 tempest-ServerDiskConfigTestJSON-1909384467-project-member] [instance: 860248ea-e77b-4ff6-af64-b75f88a31348] Allocating IP information in the background. {{(pid=68906) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1808.359176] env[68906]: DEBUG nova.network.neutron [None req-08d15b06-bc1d-4202-8ca5-a913682ec4aa tempest-ServerDiskConfigTestJSON-1909384467 tempest-ServerDiskConfigTestJSON-1909384467-project-member] [instance: 860248ea-e77b-4ff6-af64-b75f88a31348] allocate_for_instance() {{(pid=68906) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1808.368588] env[68906]: DEBUG nova.compute.manager [None req-08d15b06-bc1d-4202-8ca5-a913682ec4aa tempest-ServerDiskConfigTestJSON-1909384467 tempest-ServerDiskConfigTestJSON-1909384467-project-member] [instance: 860248ea-e77b-4ff6-af64-b75f88a31348] Start building block device mappings for instance. {{(pid=68906) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1808.428947] env[68906]: DEBUG nova.policy [None req-08d15b06-bc1d-4202-8ca5-a913682ec4aa tempest-ServerDiskConfigTestJSON-1909384467 tempest-ServerDiskConfigTestJSON-1909384467-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '58bee9a44ac942a287d360f281e25f02', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '41d6ceb682de4f6088d3b84b57ae1101', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68906) authorize /opt/stack/nova/nova/policy.py:203}} [ 1808.451303] env[68906]: DEBUG nova.compute.manager [None req-08d15b06-bc1d-4202-8ca5-a913682ec4aa tempest-ServerDiskConfigTestJSON-1909384467 tempest-ServerDiskConfigTestJSON-1909384467-project-member] [instance: 860248ea-e77b-4ff6-af64-b75f88a31348] Start spawning the instance on the hypervisor. 
{{(pid=68906) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1808.478684] env[68906]: DEBUG nova.virt.hardware [None req-08d15b06-bc1d-4202-8ca5-a913682ec4aa tempest-ServerDiskConfigTestJSON-1909384467 tempest-ServerDiskConfigTestJSON-1909384467-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-17T13:00:39Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-17T13:00:23Z,direct_url=,disk_format='vmdk',id=b1400c31-d33b-4e13-944f-4c645e62493e,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='1ae7bf3a375d41c6af5e7536af51ffd1',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-17T13:00:24Z,virtual_size=,visibility=), allow threads: False {{(pid=68906) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1808.478929] env[68906]: DEBUG nova.virt.hardware [None req-08d15b06-bc1d-4202-8ca5-a913682ec4aa tempest-ServerDiskConfigTestJSON-1909384467 tempest-ServerDiskConfigTestJSON-1909384467-project-member] Flavor limits 0:0:0 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1808.479115] env[68906]: DEBUG nova.virt.hardware [None req-08d15b06-bc1d-4202-8ca5-a913682ec4aa tempest-ServerDiskConfigTestJSON-1909384467 tempest-ServerDiskConfigTestJSON-1909384467-project-member] Image limits 0:0:0 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1808.479308] env[68906]: DEBUG nova.virt.hardware [None req-08d15b06-bc1d-4202-8ca5-a913682ec4aa tempest-ServerDiskConfigTestJSON-1909384467 tempest-ServerDiskConfigTestJSON-1909384467-project-member] Flavor pref 0:0:0 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1808.479464] env[68906]: DEBUG nova.virt.hardware [None req-08d15b06-bc1d-4202-8ca5-a913682ec4aa tempest-ServerDiskConfigTestJSON-1909384467 tempest-ServerDiskConfigTestJSON-1909384467-project-member] Image pref 0:0:0 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1808.479602] env[68906]: DEBUG nova.virt.hardware [None req-08d15b06-bc1d-4202-8ca5-a913682ec4aa tempest-ServerDiskConfigTestJSON-1909384467 tempest-ServerDiskConfigTestJSON-1909384467-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1808.479808] env[68906]: DEBUG nova.virt.hardware [None req-08d15b06-bc1d-4202-8ca5-a913682ec4aa tempest-ServerDiskConfigTestJSON-1909384467 tempest-ServerDiskConfigTestJSON-1909384467-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68906) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1808.479966] env[68906]: DEBUG nova.virt.hardware [None req-08d15b06-bc1d-4202-8ca5-a913682ec4aa tempest-ServerDiskConfigTestJSON-1909384467 tempest-ServerDiskConfigTestJSON-1909384467-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68906) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1808.480208] 
env[68906]: DEBUG nova.virt.hardware [None req-08d15b06-bc1d-4202-8ca5-a913682ec4aa tempest-ServerDiskConfigTestJSON-1909384467 tempest-ServerDiskConfigTestJSON-1909384467-project-member] Got 1 possible topologies {{(pid=68906) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1808.480395] env[68906]: DEBUG nova.virt.hardware [None req-08d15b06-bc1d-4202-8ca5-a913682ec4aa tempest-ServerDiskConfigTestJSON-1909384467 tempest-ServerDiskConfigTestJSON-1909384467-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68906) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1808.480578] env[68906]: DEBUG nova.virt.hardware [None req-08d15b06-bc1d-4202-8ca5-a913682ec4aa tempest-ServerDiskConfigTestJSON-1909384467 tempest-ServerDiskConfigTestJSON-1909384467-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68906) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1808.481495] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-78530054-df20-4d87-ab71-dcd870d2dc4d {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1808.489239] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1e899974-97b0-40d6-9558-e45e438fde7e {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1808.718017] env[68906]: DEBUG nova.network.neutron [None req-08d15b06-bc1d-4202-8ca5-a913682ec4aa tempest-ServerDiskConfigTestJSON-1909384467 tempest-ServerDiskConfigTestJSON-1909384467-project-member] [instance: 860248ea-e77b-4ff6-af64-b75f88a31348] Successfully created port: 63970843-cac4-4cec-b75d-a7138b425497 {{(pid=68906) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1809.251819] env[68906]: DEBUG nova.network.neutron [None req-08d15b06-bc1d-4202-8ca5-a913682ec4aa tempest-ServerDiskConfigTestJSON-1909384467 tempest-ServerDiskConfigTestJSON-1909384467-project-member] [instance: 860248ea-e77b-4ff6-af64-b75f88a31348] Successfully updated port: 63970843-cac4-4cec-b75d-a7138b425497 {{(pid=68906) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1809.285456] env[68906]: DEBUG oslo_concurrency.lockutils [None req-08d15b06-bc1d-4202-8ca5-a913682ec4aa tempest-ServerDiskConfigTestJSON-1909384467 tempest-ServerDiskConfigTestJSON-1909384467-project-member] Acquiring lock "refresh_cache-860248ea-e77b-4ff6-af64-b75f88a31348" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1809.285750] env[68906]: DEBUG oslo_concurrency.lockutils [None req-08d15b06-bc1d-4202-8ca5-a913682ec4aa tempest-ServerDiskConfigTestJSON-1909384467 tempest-ServerDiskConfigTestJSON-1909384467-project-member] Acquired lock "refresh_cache-860248ea-e77b-4ff6-af64-b75f88a31348" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1809.285750] env[68906]: DEBUG nova.network.neutron [None req-08d15b06-bc1d-4202-8ca5-a913682ec4aa tempest-ServerDiskConfigTestJSON-1909384467 tempest-ServerDiskConfigTestJSON-1909384467-project-member] [instance: 860248ea-e77b-4ff6-af64-b75f88a31348] Building network info cache for instance {{(pid=68906) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1809.321942] env[68906]: DEBUG 
nova.network.neutron [None req-08d15b06-bc1d-4202-8ca5-a913682ec4aa tempest-ServerDiskConfigTestJSON-1909384467 tempest-ServerDiskConfigTestJSON-1909384467-project-member] [instance: 860248ea-e77b-4ff6-af64-b75f88a31348] Instance cache missing network info. {{(pid=68906) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1809.493803] env[68906]: DEBUG nova.network.neutron [None req-08d15b06-bc1d-4202-8ca5-a913682ec4aa tempest-ServerDiskConfigTestJSON-1909384467 tempest-ServerDiskConfigTestJSON-1909384467-project-member] [instance: 860248ea-e77b-4ff6-af64-b75f88a31348] Updating instance_info_cache with network_info: [{"id": "63970843-cac4-4cec-b75d-a7138b425497", "address": "fa:16:3e:c9:c7:e1", "network": {"id": "8895f8be-c1f8-4a8b-8708-2bc0a03c5b67", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-449319980-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "41d6ceb682de4f6088d3b84b57ae1101", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "a58387dd-f438-4913-af6a-fafb734cd881", "external-id": "nsx-vlan-transportzone-169", "segmentation_id": 169, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap63970843-ca", "ovs_interfaceid": "63970843-cac4-4cec-b75d-a7138b425497", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68906) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1809.508110] env[68906]: DEBUG oslo_concurrency.lockutils [None req-08d15b06-bc1d-4202-8ca5-a913682ec4aa tempest-ServerDiskConfigTestJSON-1909384467 tempest-ServerDiskConfigTestJSON-1909384467-project-member] Releasing lock "refresh_cache-860248ea-e77b-4ff6-af64-b75f88a31348" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1809.508444] env[68906]: DEBUG nova.compute.manager [None req-08d15b06-bc1d-4202-8ca5-a913682ec4aa tempest-ServerDiskConfigTestJSON-1909384467 tempest-ServerDiskConfigTestJSON-1909384467-project-member] [instance: 860248ea-e77b-4ff6-af64-b75f88a31348] Instance network_info: |[{"id": "63970843-cac4-4cec-b75d-a7138b425497", "address": "fa:16:3e:c9:c7:e1", "network": {"id": "8895f8be-c1f8-4a8b-8708-2bc0a03c5b67", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-449319980-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "41d6ceb682de4f6088d3b84b57ae1101", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "a58387dd-f438-4913-af6a-fafb734cd881", "external-id": "nsx-vlan-transportzone-169", "segmentation_id": 169, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap63970843-ca", 
"ovs_interfaceid": "63970843-cac4-4cec-b75d-a7138b425497", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=68906) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1809.508832] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-08d15b06-bc1d-4202-8ca5-a913682ec4aa tempest-ServerDiskConfigTestJSON-1909384467 tempest-ServerDiskConfigTestJSON-1909384467-project-member] [instance: 860248ea-e77b-4ff6-af64-b75f88a31348] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:c9:c7:e1', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'a58387dd-f438-4913-af6a-fafb734cd881', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '63970843-cac4-4cec-b75d-a7138b425497', 'vif_model': 'vmxnet3'}] {{(pid=68906) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1809.516294] env[68906]: DEBUG nova.virt.vmwareapi.vm_util [None req-08d15b06-bc1d-4202-8ca5-a913682ec4aa tempest-ServerDiskConfigTestJSON-1909384467 tempest-ServerDiskConfigTestJSON-1909384467-project-member] Creating folder: Project (41d6ceb682de4f6088d3b84b57ae1101). Parent ref: group-v694750. {{(pid=68906) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1809.516880] env[68906]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-fc7b66a1-313b-4a0c-800c-588d5c44215e {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1809.528682] env[68906]: INFO nova.virt.vmwareapi.vm_util [None req-08d15b06-bc1d-4202-8ca5-a913682ec4aa tempest-ServerDiskConfigTestJSON-1909384467 tempest-ServerDiskConfigTestJSON-1909384467-project-member] Created folder: Project (41d6ceb682de4f6088d3b84b57ae1101) in parent group-v694750. [ 1809.528861] env[68906]: DEBUG nova.virt.vmwareapi.vm_util [None req-08d15b06-bc1d-4202-8ca5-a913682ec4aa tempest-ServerDiskConfigTestJSON-1909384467 tempest-ServerDiskConfigTestJSON-1909384467-project-member] Creating folder: Instances. Parent ref: group-v694847. {{(pid=68906) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1809.529098] env[68906]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-e45df75a-ea93-44fb-98a7-c6337f121808 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1809.537327] env[68906]: INFO nova.virt.vmwareapi.vm_util [None req-08d15b06-bc1d-4202-8ca5-a913682ec4aa tempest-ServerDiskConfigTestJSON-1909384467 tempest-ServerDiskConfigTestJSON-1909384467-project-member] Created folder: Instances in parent group-v694847. [ 1809.537753] env[68906]: DEBUG oslo.service.loopingcall [None req-08d15b06-bc1d-4202-8ca5-a913682ec4aa tempest-ServerDiskConfigTestJSON-1909384467 tempest-ServerDiskConfigTestJSON-1909384467-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=68906) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1809.537753] env[68906]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 860248ea-e77b-4ff6-af64-b75f88a31348] Creating VM on the ESX host {{(pid=68906) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1809.537907] env[68906]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-d87aede3-9a0e-4358-906d-59d8eb5d28b0 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1809.556504] env[68906]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1809.556504] env[68906]: value = "task-3475437" [ 1809.556504] env[68906]: _type = "Task" [ 1809.556504] env[68906]: } to complete. {{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1809.563561] env[68906]: DEBUG oslo_vmware.api [-] Task: {'id': task-3475437, 'name': CreateVM_Task} progress is 0%. {{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1809.979070] env[68906]: DEBUG nova.compute.manager [req-2ae8772a-c9b1-4ced-a86b-2b13ec0b34b0 req-5ccfceb9-c158-4d84-a12e-31f9d47e0aeb service nova] [instance: 860248ea-e77b-4ff6-af64-b75f88a31348] Received event network-vif-plugged-63970843-cac4-4cec-b75d-a7138b425497 {{(pid=68906) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1809.979691] env[68906]: DEBUG oslo_concurrency.lockutils [req-2ae8772a-c9b1-4ced-a86b-2b13ec0b34b0 req-5ccfceb9-c158-4d84-a12e-31f9d47e0aeb service nova] Acquiring lock "860248ea-e77b-4ff6-af64-b75f88a31348-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1809.979924] env[68906]: DEBUG oslo_concurrency.lockutils [req-2ae8772a-c9b1-4ced-a86b-2b13ec0b34b0 req-5ccfceb9-c158-4d84-a12e-31f9d47e0aeb service nova] Lock "860248ea-e77b-4ff6-af64-b75f88a31348-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1809.980118] env[68906]: DEBUG oslo_concurrency.lockutils [req-2ae8772a-c9b1-4ced-a86b-2b13ec0b34b0 req-5ccfceb9-c158-4d84-a12e-31f9d47e0aeb service nova] Lock "860248ea-e77b-4ff6-af64-b75f88a31348-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1809.980320] env[68906]: DEBUG nova.compute.manager [req-2ae8772a-c9b1-4ced-a86b-2b13ec0b34b0 req-5ccfceb9-c158-4d84-a12e-31f9d47e0aeb service nova] [instance: 860248ea-e77b-4ff6-af64-b75f88a31348] No waiting events found dispatching network-vif-plugged-63970843-cac4-4cec-b75d-a7138b425497 {{(pid=68906) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1809.980498] env[68906]: WARNING nova.compute.manager [req-2ae8772a-c9b1-4ced-a86b-2b13ec0b34b0 req-5ccfceb9-c158-4d84-a12e-31f9d47e0aeb service nova] [instance: 860248ea-e77b-4ff6-af64-b75f88a31348] Received unexpected event network-vif-plugged-63970843-cac4-4cec-b75d-a7138b425497 for instance with vm_state building and task_state spawning. 
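
The CreateVM_Task records above show the wait-and-poll contract these oslo.vmware calls follow: the SOAP invocation returns a task handle immediately, and the caller polls it at a fixed interval until the task reports success or error. Below is a minimal stdlib-only sketch of that pattern; Task, wait_for_task, and the poll interval are illustrative stand-ins, not the oslo.vmware API.

    import time

    class TaskError(Exception):
        """Raised when a polled task finishes in an error state."""

    class Task:
        """Illustrative stand-in for a vCenter task handle (not the real API)."""
        def __init__(self, total_polls):
            self._polls = 0
            self._total = total_polls

        def poll(self):
            # Each poll advances the fake task; a real handle would query vCenter.
            self._polls += 1
            progress = min(100, int(100 * self._polls / self._total))
            state = "success" if progress == 100 else "running"
            return {"state": state, "progress": progress, "error": None}

    def wait_for_task(task, interval=0.5, timeout=60.0):
        """Poll `task` every `interval` seconds until success, error, or timeout."""
        deadline = time.monotonic() + timeout
        while True:
            info = task.poll()
            print(f"progress is {info['progress']}%")
            if info["state"] == "success":
                return info
            if info["state"] == "error":
                raise TaskError(info["error"])
            if time.monotonic() > deadline:
                raise TimeoutError("task did not complete in time")
            time.sleep(interval)

    if __name__ == "__main__":
        wait_for_task(Task(total_polls=3), interval=0.1)

In the log, each "progress is N%" line corresponds to one such poll, and 'duration_secs' is reported once the final poll sees the task complete.
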
[ 1809.980664] env[68906]: DEBUG nova.compute.manager [req-2ae8772a-c9b1-4ced-a86b-2b13ec0b34b0 req-5ccfceb9-c158-4d84-a12e-31f9d47e0aeb service nova] [instance: 860248ea-e77b-4ff6-af64-b75f88a31348] Received event network-changed-63970843-cac4-4cec-b75d-a7138b425497 {{(pid=68906) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1809.980821] env[68906]: DEBUG nova.compute.manager [req-2ae8772a-c9b1-4ced-a86b-2b13ec0b34b0 req-5ccfceb9-c158-4d84-a12e-31f9d47e0aeb service nova] [instance: 860248ea-e77b-4ff6-af64-b75f88a31348] Refreshing instance network info cache due to event network-changed-63970843-cac4-4cec-b75d-a7138b425497. {{(pid=68906) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1809.981017] env[68906]: DEBUG oslo_concurrency.lockutils [req-2ae8772a-c9b1-4ced-a86b-2b13ec0b34b0 req-5ccfceb9-c158-4d84-a12e-31f9d47e0aeb service nova] Acquiring lock "refresh_cache-860248ea-e77b-4ff6-af64-b75f88a31348" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1809.981166] env[68906]: DEBUG oslo_concurrency.lockutils [req-2ae8772a-c9b1-4ced-a86b-2b13ec0b34b0 req-5ccfceb9-c158-4d84-a12e-31f9d47e0aeb service nova] Acquired lock "refresh_cache-860248ea-e77b-4ff6-af64-b75f88a31348" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1809.981404] env[68906]: DEBUG nova.network.neutron [req-2ae8772a-c9b1-4ced-a86b-2b13ec0b34b0 req-5ccfceb9-c158-4d84-a12e-31f9d47e0aeb service nova] [instance: 860248ea-e77b-4ff6-af64-b75f88a31348] Refreshing network info cache for port 63970843-cac4-4cec-b75d-a7138b425497 {{(pid=68906) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1810.067088] env[68906]: DEBUG oslo_vmware.api [-] Task: {'id': task-3475437, 'name': CreateVM_Task, 'duration_secs': 0.361334} completed successfully. 
{{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1810.067258] env[68906]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 860248ea-e77b-4ff6-af64-b75f88a31348] Created VM on the ESX host {{(pid=68906) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1810.067908] env[68906]: DEBUG oslo_concurrency.lockutils [None req-08d15b06-bc1d-4202-8ca5-a913682ec4aa tempest-ServerDiskConfigTestJSON-1909384467 tempest-ServerDiskConfigTestJSON-1909384467-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1810.068087] env[68906]: DEBUG oslo_concurrency.lockutils [None req-08d15b06-bc1d-4202-8ca5-a913682ec4aa tempest-ServerDiskConfigTestJSON-1909384467 tempest-ServerDiskConfigTestJSON-1909384467-project-member] Acquired lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1810.068417] env[68906]: DEBUG oslo_concurrency.lockutils [None req-08d15b06-bc1d-4202-8ca5-a913682ec4aa tempest-ServerDiskConfigTestJSON-1909384467 tempest-ServerDiskConfigTestJSON-1909384467-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1810.068657] env[68906]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-84038684-9226-4acb-81f1-77b8a889374e {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1810.073254] env[68906]: DEBUG oslo_vmware.api [None req-08d15b06-bc1d-4202-8ca5-a913682ec4aa tempest-ServerDiskConfigTestJSON-1909384467 tempest-ServerDiskConfigTestJSON-1909384467-project-member] Waiting for the task: (returnval){ [ 1810.073254] env[68906]: value = "session[52a3cb0d-1212-490c-2669-91043b4da4d8]52612449-e8e3-ec0e-4d2e-e5a89bd91f70" [ 1810.073254] env[68906]: _type = "Task" [ 1810.073254] env[68906]: } to complete. {{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1810.080563] env[68906]: DEBUG oslo_vmware.api [None req-08d15b06-bc1d-4202-8ca5-a913682ec4aa tempest-ServerDiskConfigTestJSON-1909384467 tempest-ServerDiskConfigTestJSON-1909384467-project-member] Task: {'id': session[52a3cb0d-1212-490c-2669-91043b4da4d8]52612449-e8e3-ec0e-4d2e-e5a89bd91f70, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1810.219636] env[68906]: DEBUG nova.network.neutron [req-2ae8772a-c9b1-4ced-a86b-2b13ec0b34b0 req-5ccfceb9-c158-4d84-a12e-31f9d47e0aeb service nova] [instance: 860248ea-e77b-4ff6-af64-b75f88a31348] Updated VIF entry in instance network info cache for port 63970843-cac4-4cec-b75d-a7138b425497. 
{{(pid=68906) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1810.219988] env[68906]: DEBUG nova.network.neutron [req-2ae8772a-c9b1-4ced-a86b-2b13ec0b34b0 req-5ccfceb9-c158-4d84-a12e-31f9d47e0aeb service nova] [instance: 860248ea-e77b-4ff6-af64-b75f88a31348] Updating instance_info_cache with network_info: [{"id": "63970843-cac4-4cec-b75d-a7138b425497", "address": "fa:16:3e:c9:c7:e1", "network": {"id": "8895f8be-c1f8-4a8b-8708-2bc0a03c5b67", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-449319980-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "41d6ceb682de4f6088d3b84b57ae1101", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "a58387dd-f438-4913-af6a-fafb734cd881", "external-id": "nsx-vlan-transportzone-169", "segmentation_id": 169, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap63970843-ca", "ovs_interfaceid": "63970843-cac4-4cec-b75d-a7138b425497", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68906) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1810.229020] env[68906]: DEBUG oslo_concurrency.lockutils [req-2ae8772a-c9b1-4ced-a86b-2b13ec0b34b0 req-5ccfceb9-c158-4d84-a12e-31f9d47e0aeb service nova] Releasing lock "refresh_cache-860248ea-e77b-4ff6-af64-b75f88a31348" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1810.583711] env[68906]: DEBUG oslo_concurrency.lockutils [None req-08d15b06-bc1d-4202-8ca5-a913682ec4aa tempest-ServerDiskConfigTestJSON-1909384467 tempest-ServerDiskConfigTestJSON-1909384467-project-member] Releasing lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1810.584070] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-08d15b06-bc1d-4202-8ca5-a913682ec4aa tempest-ServerDiskConfigTestJSON-1909384467 tempest-ServerDiskConfigTestJSON-1909384467-project-member] [instance: 860248ea-e77b-4ff6-af64-b75f88a31348] Processing image b1400c31-d33b-4e13-944f-4c645e62493e {{(pid=68906) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1810.584187] env[68906]: DEBUG oslo_concurrency.lockutils [None req-08d15b06-bc1d-4202-8ca5-a913682ec4aa tempest-ServerDiskConfigTestJSON-1909384467 tempest-ServerDiskConfigTestJSON-1909384467-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e/b1400c31-d33b-4e13-944f-4c645e62493e.vmdk" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1814.926688] env[68906]: DEBUG oslo_concurrency.lockutils [None req-6ecfa4f2-113a-4e91-9370-b3cb03d6f639 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Acquiring lock "8bfc91d4-b1d7-449a-8d48-0e63490fe663" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" 
{{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1814.926688] env[68906]: DEBUG oslo_concurrency.lockutils [None req-6ecfa4f2-113a-4e91-9370-b3cb03d6f639 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Lock "8bfc91d4-b1d7-449a-8d48-0e63490fe663" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1823.125390] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._sync_power_states {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1823.148233] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Getting list of instances from cluster (obj){ [ 1823.148233] env[68906]: value = "domain-c8" [ 1823.148233] env[68906]: _type = "ClusterComputeResource" [ 1823.148233] env[68906]: } {{(pid=68906) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2122}} [ 1823.149548] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-19c6e8ee-58f3-4cca-ae27-0df8066c1361 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1823.167951] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Got total of 10 instances {{(pid=68906) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2131}} [ 1823.168126] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Triggering sync for uuid 89171680-c76d-4826-9236-379542661ffb {{(pid=68906) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 1823.168331] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Triggering sync for uuid 9b884416-df89-4d8c-b2ab-0667db52a718 {{(pid=68906) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 1823.168501] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Triggering sync for uuid aed06616-d008-4695-b66e-9f40acf5ebd3 {{(pid=68906) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 1823.168660] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Triggering sync for uuid 17327bc3-433e-4006-93c7-e53714ed70c2 {{(pid=68906) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 1823.168817] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Triggering sync for uuid 32f5b54d-30bf-4fe9-9622-3ff74344b3f3 {{(pid=68906) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 1823.168967] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Triggering sync for uuid 922d81ba-c8d2-43ba-b1c5-f2943418d6a2 {{(pid=68906) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 1823.169132] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Triggering sync for uuid 736db39c-e5e5-4a54-b85a-aa5c703f432e {{(pid=68906) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 1823.169280] 
env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Triggering sync for uuid ce6e5cd6-efb8-46d1-811d-74c084661cce {{(pid=68906) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 1823.169425] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Triggering sync for uuid 7994d291-b4bf-48f5-ad34-c1f484d77f6e {{(pid=68906) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 1823.169569] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Triggering sync for uuid 860248ea-e77b-4ff6-af64-b75f88a31348 {{(pid=68906) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 1823.169894] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Acquiring lock "89171680-c76d-4826-9236-379542661ffb" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1823.170139] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Acquiring lock "9b884416-df89-4d8c-b2ab-0667db52a718" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1823.170353] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Acquiring lock "aed06616-d008-4695-b66e-9f40acf5ebd3" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1823.170548] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Acquiring lock "17327bc3-433e-4006-93c7-e53714ed70c2" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1823.170956] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Acquiring lock "32f5b54d-30bf-4fe9-9622-3ff74344b3f3" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1823.171212] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Acquiring lock "922d81ba-c8d2-43ba-b1c5-f2943418d6a2" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1823.171423] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Acquiring lock "736db39c-e5e5-4a54-b85a-aa5c703f432e" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1823.171624] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Acquiring lock 
"ce6e5cd6-efb8-46d1-811d-74c084661cce" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1823.171849] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Acquiring lock "7994d291-b4bf-48f5-ad34-c1f484d77f6e" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1823.172075] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Acquiring lock "860248ea-e77b-4ff6-af64-b75f88a31348" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1827.577927] env[68906]: DEBUG oslo_concurrency.lockutils [None req-9f5c00ab-85af-4a9f-80ab-50bd6f212585 tempest-ServerTagsTestJSON-693567183 tempest-ServerTagsTestJSON-693567183-project-member] Acquiring lock "7994d291-b4bf-48f5-ad34-c1f484d77f6e" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1845.225205] env[68906]: DEBUG oslo_concurrency.lockutils [None req-4077b494-948d-40fb-ba1f-1a5eae01fbe0 tempest-AttachVolumeTestJSON-1667500444 tempest-AttachVolumeTestJSON-1667500444-project-member] Acquiring lock "d70b039d-c8ad-4ffd-84f8-08f17cb97578" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1845.225534] env[68906]: DEBUG oslo_concurrency.lockutils [None req-4077b494-948d-40fb-ba1f-1a5eae01fbe0 tempest-AttachVolumeTestJSON-1667500444 tempest-AttachVolumeTestJSON-1667500444-project-member] Lock "d70b039d-c8ad-4ffd-84f8-08f17cb97578" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1846.188172] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1848.140158] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1848.140442] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Starting heal instance info cache {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 1848.140511] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Rebuilding the list of instances to heal {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 1848.163415] env[68906]: DEBUG nova.compute.manager [None 
req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: 89171680-c76d-4826-9236-379542661ffb] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1848.163619] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: 9b884416-df89-4d8c-b2ab-0667db52a718] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1848.163702] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: aed06616-d008-4695-b66e-9f40acf5ebd3] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1848.163828] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: 17327bc3-433e-4006-93c7-e53714ed70c2] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1848.163948] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: 32f5b54d-30bf-4fe9-9622-3ff74344b3f3] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1848.164082] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: 922d81ba-c8d2-43ba-b1c5-f2943418d6a2] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1848.164205] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: 736db39c-e5e5-4a54-b85a-aa5c703f432e] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1848.164327] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: ce6e5cd6-efb8-46d1-811d-74c084661cce] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1848.164433] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: 7994d291-b4bf-48f5-ad34-c1f484d77f6e] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1848.164549] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: 860248ea-e77b-4ff6-af64-b75f88a31348] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1848.164668] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Didn't find any instances for network info cache update. 
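
The run of "Skipping network cache update" records above is the periodic cache-heal pass walking its rebuilt instance list and passing over everything still being built. A rough sketch of that filtering step follows; the Instance dataclass and BUILDING constant are stand-ins for Nova's objects, not its real types.

    from dataclasses import dataclass

    BUILDING = "building"  # stand-in for nova's vm_states.BUILDING

    @dataclass
    class Instance:
        uuid: str
        vm_state: str

    def instances_to_heal(instances):
        """Yield only instances whose network info cache is worth refreshing."""
        for inst in instances:
            if inst.vm_state == BUILDING:
                # Building instances get their cache populated by the build
                # path itself, so a heal pass would only race with it.
                print(f"[instance: {inst.uuid}] Skipping network cache update "
                      "for instance because it is Building.")
                continue
            yield inst

    if __name__ == "__main__":
        pending = [Instance("89171680-c76d-4826-9236-379542661ffb", BUILDING)]
        if not list(instances_to_heal(pending)):
            print("Didn't find any instances for network info cache update.")
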
{{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 1848.331815] env[68906]: DEBUG oslo_concurrency.lockutils [None req-400efce6-9f91-4c1f-9c17-29902c0c577a tempest-ServerDiskConfigTestJSON-1909384467 tempest-ServerDiskConfigTestJSON-1909384467-project-member] Acquiring lock "860248ea-e77b-4ff6-af64-b75f88a31348" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1850.140499] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1850.140880] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1853.141369] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1855.141632] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1855.141913] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] CONF.reclaim_instance_interval <= 0, skipping... 
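
"CONF.reclaim_instance_interval <= 0, skipping..." is a config gate rather than a scheduling decision: the periodic task still fires on its interval, but returns at once when deferred delete is disabled. A compact sketch of that fire-then-gate shape; the conf dict and run_periodic helper are invented for illustration, not oslo.service's API.

    import time

    conf = {"reclaim_instance_interval": 0}  # 0 or less disables the reclaim pass

    def reclaim_queued_deletes():
        interval = conf["reclaim_instance_interval"]
        if interval <= 0:
            print("CONF.reclaim_instance_interval <= 0, skipping...")
            return
        print(f"Reclaiming instances deleted more than {interval}s ago")

    def run_periodic(task, spacing, ticks):
        # A real service loops forever; a few ticks are enough to show the shape.
        for _ in range(ticks):
            print(f"Running periodic task {task.__name__}")
            task()
            time.sleep(spacing)

    if __name__ == "__main__":
        run_periodic(reclaim_queued_deletes, spacing=0.1, ticks=2)
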
{{(pid=68906) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 1855.495169] env[68906]: WARNING oslo_vmware.rw_handles [None req-405761bb-a2e6-43b7-ab88-ccfe847f6af7 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1855.495169] env[68906]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1855.495169] env[68906]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1855.495169] env[68906]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1855.495169] env[68906]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1855.495169] env[68906]: ERROR oslo_vmware.rw_handles response.begin() [ 1855.495169] env[68906]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1855.495169] env[68906]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1855.495169] env[68906]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1855.495169] env[68906]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1855.495169] env[68906]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1855.495169] env[68906]: ERROR oslo_vmware.rw_handles [ 1855.495624] env[68906]: DEBUG nova.virt.vmwareapi.images [None req-405761bb-a2e6-43b7-ab88-ccfe847f6af7 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] [instance: 89171680-c76d-4826-9236-379542661ffb] Downloaded image file data b1400c31-d33b-4e13-944f-4c645e62493e to vmware_temp/8ccf706c-92fb-44db-a1d2-5c07abb76020/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk on the data store datastore2 {{(pid=68906) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1855.497710] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-405761bb-a2e6-43b7-ab88-ccfe847f6af7 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] [instance: 89171680-c76d-4826-9236-379542661ffb] Caching image {{(pid=68906) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1855.497968] env[68906]: DEBUG nova.virt.vmwareapi.vm_util [None req-405761bb-a2e6-43b7-ab88-ccfe847f6af7 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Copying Virtual Disk [datastore2] vmware_temp/8ccf706c-92fb-44db-a1d2-5c07abb76020/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk to [datastore2] vmware_temp/8ccf706c-92fb-44db-a1d2-5c07abb76020/b1400c31-d33b-4e13-944f-4c645e62493e/b1400c31-d33b-4e13-944f-4c645e62493e.vmdk {{(pid=68906) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1855.498265] env[68906]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-01fe02da-0b2b-40b7-be25-781f1137d781 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1855.507145] env[68906]: DEBUG oslo_vmware.api [None req-405761bb-a2e6-43b7-ab88-ccfe847f6af7 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Waiting 
for the task: (returnval){ [ 1855.507145] env[68906]: value = "task-3475438" [ 1855.507145] env[68906]: _type = "Task" [ 1855.507145] env[68906]: } to complete. {{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1855.514618] env[68906]: DEBUG oslo_vmware.api [None req-405761bb-a2e6-43b7-ab88-ccfe847f6af7 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Task: {'id': task-3475438, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1856.018020] env[68906]: DEBUG oslo_vmware.exceptions [None req-405761bb-a2e6-43b7-ab88-ccfe847f6af7 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Fault InvalidArgument not matched. {{(pid=68906) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1856.018345] env[68906]: DEBUG oslo_concurrency.lockutils [None req-405761bb-a2e6-43b7-ab88-ccfe847f6af7 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Releasing lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e/b1400c31-d33b-4e13-944f-4c645e62493e.vmdk" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1856.018933] env[68906]: ERROR nova.compute.manager [None req-405761bb-a2e6-43b7-ab88-ccfe847f6af7 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] [instance: 89171680-c76d-4826-9236-379542661ffb] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1856.018933] env[68906]: Faults: ['InvalidArgument'] [ 1856.018933] env[68906]: ERROR nova.compute.manager [instance: 89171680-c76d-4826-9236-379542661ffb] Traceback (most recent call last): [ 1856.018933] env[68906]: ERROR nova.compute.manager [instance: 89171680-c76d-4826-9236-379542661ffb] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1856.018933] env[68906]: ERROR nova.compute.manager [instance: 89171680-c76d-4826-9236-379542661ffb] yield resources [ 1856.018933] env[68906]: ERROR nova.compute.manager [instance: 89171680-c76d-4826-9236-379542661ffb] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1856.018933] env[68906]: ERROR nova.compute.manager [instance: 89171680-c76d-4826-9236-379542661ffb] self.driver.spawn(context, instance, image_meta, [ 1856.018933] env[68906]: ERROR nova.compute.manager [instance: 89171680-c76d-4826-9236-379542661ffb] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1856.018933] env[68906]: ERROR nova.compute.manager [instance: 89171680-c76d-4826-9236-379542661ffb] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1856.018933] env[68906]: ERROR nova.compute.manager [instance: 89171680-c76d-4826-9236-379542661ffb] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1856.018933] env[68906]: ERROR nova.compute.manager [instance: 89171680-c76d-4826-9236-379542661ffb] self._fetch_image_if_missing(context, vi) [ 1856.018933] env[68906]: ERROR nova.compute.manager [instance: 89171680-c76d-4826-9236-379542661ffb] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1856.019436] env[68906]: ERROR nova.compute.manager 
[instance: 89171680-c76d-4826-9236-379542661ffb] image_cache(vi, tmp_image_ds_loc) [ 1856.019436] env[68906]: ERROR nova.compute.manager [instance: 89171680-c76d-4826-9236-379542661ffb] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1856.019436] env[68906]: ERROR nova.compute.manager [instance: 89171680-c76d-4826-9236-379542661ffb] vm_util.copy_virtual_disk( [ 1856.019436] env[68906]: ERROR nova.compute.manager [instance: 89171680-c76d-4826-9236-379542661ffb] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1856.019436] env[68906]: ERROR nova.compute.manager [instance: 89171680-c76d-4826-9236-379542661ffb] session._wait_for_task(vmdk_copy_task) [ 1856.019436] env[68906]: ERROR nova.compute.manager [instance: 89171680-c76d-4826-9236-379542661ffb] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1856.019436] env[68906]: ERROR nova.compute.manager [instance: 89171680-c76d-4826-9236-379542661ffb] return self.wait_for_task(task_ref) [ 1856.019436] env[68906]: ERROR nova.compute.manager [instance: 89171680-c76d-4826-9236-379542661ffb] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1856.019436] env[68906]: ERROR nova.compute.manager [instance: 89171680-c76d-4826-9236-379542661ffb] return evt.wait() [ 1856.019436] env[68906]: ERROR nova.compute.manager [instance: 89171680-c76d-4826-9236-379542661ffb] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1856.019436] env[68906]: ERROR nova.compute.manager [instance: 89171680-c76d-4826-9236-379542661ffb] result = hub.switch() [ 1856.019436] env[68906]: ERROR nova.compute.manager [instance: 89171680-c76d-4826-9236-379542661ffb] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1856.019436] env[68906]: ERROR nova.compute.manager [instance: 89171680-c76d-4826-9236-379542661ffb] return self.greenlet.switch() [ 1856.019818] env[68906]: ERROR nova.compute.manager [instance: 89171680-c76d-4826-9236-379542661ffb] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1856.019818] env[68906]: ERROR nova.compute.manager [instance: 89171680-c76d-4826-9236-379542661ffb] self.f(*self.args, **self.kw) [ 1856.019818] env[68906]: ERROR nova.compute.manager [instance: 89171680-c76d-4826-9236-379542661ffb] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1856.019818] env[68906]: ERROR nova.compute.manager [instance: 89171680-c76d-4826-9236-379542661ffb] raise exceptions.translate_fault(task_info.error) [ 1856.019818] env[68906]: ERROR nova.compute.manager [instance: 89171680-c76d-4826-9236-379542661ffb] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1856.019818] env[68906]: ERROR nova.compute.manager [instance: 89171680-c76d-4826-9236-379542661ffb] Faults: ['InvalidArgument'] [ 1856.019818] env[68906]: ERROR nova.compute.manager [instance: 89171680-c76d-4826-9236-379542661ffb] [ 1856.019818] env[68906]: INFO nova.compute.manager [None req-405761bb-a2e6-43b7-ab88-ccfe847f6af7 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] [instance: 89171680-c76d-4826-9236-379542661ffb] Terminating instance [ 1856.020828] env[68906]: DEBUG oslo_concurrency.lockutils [None req-82d3bc4e-32c1-4af2-8d04-b9efe25d88fc 
tempest-ListServersNegativeTestJSON-844965327 tempest-ListServersNegativeTestJSON-844965327-project-member] Acquired lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e/b1400c31-d33b-4e13-944f-4c645e62493e.vmdk" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1856.021047] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-82d3bc4e-32c1-4af2-8d04-b9efe25d88fc tempest-ListServersNegativeTestJSON-844965327 tempest-ListServersNegativeTestJSON-844965327-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=68906) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1856.021323] env[68906]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-8862fefc-c53c-4754-be48-2c617b7e7303 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1856.023584] env[68906]: DEBUG nova.compute.manager [None req-405761bb-a2e6-43b7-ab88-ccfe847f6af7 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] [instance: 89171680-c76d-4826-9236-379542661ffb] Start destroying the instance on the hypervisor. {{(pid=68906) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1856.023784] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-405761bb-a2e6-43b7-ab88-ccfe847f6af7 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] [instance: 89171680-c76d-4826-9236-379542661ffb] Destroying instance {{(pid=68906) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1856.024570] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-31a571c5-20b1-430d-8f23-4fc6d3b99240 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1856.031420] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-405761bb-a2e6-43b7-ab88-ccfe847f6af7 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] [instance: 89171680-c76d-4826-9236-379542661ffb] Unregistering the VM {{(pid=68906) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1856.031654] env[68906]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-a6ab4af6-923d-4d78-8912-a0a63089202c {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1856.033773] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-82d3bc4e-32c1-4af2-8d04-b9efe25d88fc tempest-ListServersNegativeTestJSON-844965327 tempest-ListServersNegativeTestJSON-844965327-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=68906) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1856.033947] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-82d3bc4e-32c1-4af2-8d04-b9efe25d88fc tempest-ListServersNegativeTestJSON-844965327 tempest-ListServersNegativeTestJSON-844965327-project-member] Folder [datastore2] devstack-image-cache_base created. 
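
The mkdir sequence just above ("Creating directory ... Created directory ... Folder ... created") is an idempotent ensure-exists step: the directory is created unconditionally and an already-exists answer counts as success, so two requests racing on the same image-cache folder cannot fail each other. A local-filesystem sketch of the same convention (the log's paths go through the vCenter FileManager instead):

    import os

    def create_folder_if_missing(path):
        """Ensure `path` exists, treating 'already there' as success."""
        try:
            os.makedirs(path)
            print(f"Created directory with path {path}")
        except FileExistsError:
            # Another worker won the race; the folder is usable either way.
            print(f"Directory {path} already exists")

    if __name__ == "__main__":
        create_folder_if_missing("/tmp/devstack-image-cache_base")
        create_folder_if_missing("/tmp/devstack-image-cache_base")  # no-op
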
{{(pid=68906) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1856.034891] env[68906]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-acfda88d-607c-4678-a614-7b552a6a18e0 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1856.039322] env[68906]: DEBUG oslo_vmware.api [None req-82d3bc4e-32c1-4af2-8d04-b9efe25d88fc tempest-ListServersNegativeTestJSON-844965327 tempest-ListServersNegativeTestJSON-844965327-project-member] Waiting for the task: (returnval){ [ 1856.039322] env[68906]: value = "session[52a3cb0d-1212-490c-2669-91043b4da4d8]52ab5ff8-c9dc-7877-0977-1fd0f7d640b4" [ 1856.039322] env[68906]: _type = "Task" [ 1856.039322] env[68906]: } to complete. {{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1856.046406] env[68906]: DEBUG oslo_vmware.api [None req-82d3bc4e-32c1-4af2-8d04-b9efe25d88fc tempest-ListServersNegativeTestJSON-844965327 tempest-ListServersNegativeTestJSON-844965327-project-member] Task: {'id': session[52a3cb0d-1212-490c-2669-91043b4da4d8]52ab5ff8-c9dc-7877-0977-1fd0f7d640b4, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1856.103057] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-405761bb-a2e6-43b7-ab88-ccfe847f6af7 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] [instance: 89171680-c76d-4826-9236-379542661ffb] Unregistered the VM {{(pid=68906) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1856.103285] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-405761bb-a2e6-43b7-ab88-ccfe847f6af7 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] [instance: 89171680-c76d-4826-9236-379542661ffb] Deleting contents of the VM from datastore datastore2 {{(pid=68906) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1856.103464] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-405761bb-a2e6-43b7-ab88-ccfe847f6af7 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Deleting the datastore file [datastore2] 89171680-c76d-4826-9236-379542661ffb {{(pid=68906) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1856.103729] env[68906]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-cc962ba8-f57c-4528-9762-4c66a9fb850b {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1856.109392] env[68906]: DEBUG oslo_vmware.api [None req-405761bb-a2e6-43b7-ab88-ccfe847f6af7 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Waiting for the task: (returnval){ [ 1856.109392] env[68906]: value = "task-3475440" [ 1856.109392] env[68906]: _type = "Task" [ 1856.109392] env[68906]: } to complete. {{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1856.116971] env[68906]: DEBUG oslo_vmware.api [None req-405761bb-a2e6-43b7-ab88-ccfe847f6af7 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Task: {'id': task-3475440, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1856.548993] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-82d3bc4e-32c1-4af2-8d04-b9efe25d88fc tempest-ListServersNegativeTestJSON-844965327 tempest-ListServersNegativeTestJSON-844965327-project-member] [instance: 9b884416-df89-4d8c-b2ab-0667db52a718] Preparing fetch location {{(pid=68906) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1856.549274] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-82d3bc4e-32c1-4af2-8d04-b9efe25d88fc tempest-ListServersNegativeTestJSON-844965327 tempest-ListServersNegativeTestJSON-844965327-project-member] Creating directory with path [datastore2] vmware_temp/a80c5472-2f66-41fa-af34-9b95f1444da8/b1400c31-d33b-4e13-944f-4c645e62493e {{(pid=68906) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1856.549503] env[68906]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-96da4bd2-cb7c-4af0-83bc-6ccbb564ea92 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1856.560199] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-82d3bc4e-32c1-4af2-8d04-b9efe25d88fc tempest-ListServersNegativeTestJSON-844965327 tempest-ListServersNegativeTestJSON-844965327-project-member] Created directory with path [datastore2] vmware_temp/a80c5472-2f66-41fa-af34-9b95f1444da8/b1400c31-d33b-4e13-944f-4c645e62493e {{(pid=68906) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1856.560404] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-82d3bc4e-32c1-4af2-8d04-b9efe25d88fc tempest-ListServersNegativeTestJSON-844965327 tempest-ListServersNegativeTestJSON-844965327-project-member] [instance: 9b884416-df89-4d8c-b2ab-0667db52a718] Fetch image to [datastore2] vmware_temp/a80c5472-2f66-41fa-af34-9b95f1444da8/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk {{(pid=68906) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1856.560553] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-82d3bc4e-32c1-4af2-8d04-b9efe25d88fc tempest-ListServersNegativeTestJSON-844965327 tempest-ListServersNegativeTestJSON-844965327-project-member] [instance: 9b884416-df89-4d8c-b2ab-0667db52a718] Downloading image file data b1400c31-d33b-4e13-944f-4c645e62493e to [datastore2] vmware_temp/a80c5472-2f66-41fa-af34-9b95f1444da8/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk on the data store datastore2 {{(pid=68906) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1856.561303] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-32647964-3594-4aa4-994d-4da5393b1eb3 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1856.567809] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-63307809-256f-492d-bc63-99d766cb2e25 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1856.576481] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-40aef7ee-c953-4db2-8b15-f0ebbc917c83 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1856.607535] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-228810b6-2d1f-4948-b1cd-04d1bacb4694 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1856.618468] env[68906]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-fc434dfe-c0d7-430b-b825-55639c65d94c {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1856.620154] env[68906]: DEBUG oslo_vmware.api [None req-405761bb-a2e6-43b7-ab88-ccfe847f6af7 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Task: {'id': task-3475440, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.092177} completed successfully. {{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1856.620398] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-405761bb-a2e6-43b7-ab88-ccfe847f6af7 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Deleted the datastore file {{(pid=68906) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1856.620573] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-405761bb-a2e6-43b7-ab88-ccfe847f6af7 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] [instance: 89171680-c76d-4826-9236-379542661ffb] Deleted contents of the VM from datastore datastore2 {{(pid=68906) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1856.620744] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-405761bb-a2e6-43b7-ab88-ccfe847f6af7 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] [instance: 89171680-c76d-4826-9236-379542661ffb] Instance destroyed {{(pid=68906) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1856.620917] env[68906]: INFO nova.compute.manager [None req-405761bb-a2e6-43b7-ab88-ccfe847f6af7 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] [instance: 89171680-c76d-4826-9236-379542661ffb] Took 0.60 seconds to destroy the instance on the hypervisor. 
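The unregister/delete sequence above is the invoke-then-poll pattern oslo.vmware uses for every long-running vSphere call: a *_Task method is invoked through the session, and wait_for_task() polls the returned Task object (the "progress is 0%" lines) until it reaches a terminal state. A minimal sketch of that pattern, with placeholder host, credentials and datastore path rather than anything taken from this log:

    # Illustrative only: the invoke-then-poll pattern behind the
    # DeleteDatastoreFile_Task entries above, not Nova's ds_util code.
    from oslo_vmware import api

    session = api.VMwareAPISession(
        'vc.example.org', 'user', 'secret',      # placeholder vCenter + creds
        api_retry_count=10, task_poll_interval=0.5)

    file_manager = session.vim.service_content.fileManager
    task = session.invoke_api(
        session.vim, 'DeleteDatastoreFile_Task', file_manager,
        name='[datastore2] some-instance-dir',   # placeholder datastore path
        datacenter=None)                         # a Datacenter moref in real code

    # Polls the task, logging progress as it goes, and raises if the
    # task finishes in an error state instead of returning a result.
    session.wait_for_task(task)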
[ 1856.623049] env[68906]: DEBUG nova.compute.claims [None req-405761bb-a2e6-43b7-ab88-ccfe847f6af7 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] [instance: 89171680-c76d-4826-9236-379542661ffb] Aborting claim: {{(pid=68906) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1856.623218] env[68906]: DEBUG oslo_concurrency.lockutils [None req-405761bb-a2e6-43b7-ab88-ccfe847f6af7 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1856.623433] env[68906]: DEBUG oslo_concurrency.lockutils [None req-405761bb-a2e6-43b7-ab88-ccfe847f6af7 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1856.641367] env[68906]: DEBUG nova.virt.vmwareapi.images [None req-82d3bc4e-32c1-4af2-8d04-b9efe25d88fc tempest-ListServersNegativeTestJSON-844965327 tempest-ListServersNegativeTestJSON-844965327-project-member] [instance: 9b884416-df89-4d8c-b2ab-0667db52a718] Downloading image file data b1400c31-d33b-4e13-944f-4c645e62493e to the data store datastore2 {{(pid=68906) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1856.798398] env[68906]: DEBUG oslo_vmware.rw_handles [None req-82d3bc4e-32c1-4af2-8d04-b9efe25d88fc tempest-ListServersNegativeTestJSON-844965327 tempest-ListServersNegativeTestJSON-844965327-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/a80c5472-2f66-41fa-af34-9b95f1444da8/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=68906) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1856.857246] env[68906]: DEBUG oslo_vmware.rw_handles [None req-82d3bc4e-32c1-4af2-8d04-b9efe25d88fc tempest-ListServersNegativeTestJSON-844965327 tempest-ListServersNegativeTestJSON-844965327-project-member] Completed reading data from the image iterator. {{(pid=68906) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1856.857433] env[68906]: DEBUG oslo_vmware.rw_handles [None req-82d3bc4e-32c1-4af2-8d04-b9efe25d88fc tempest-ListServersNegativeTestJSON-844965327 tempest-ListServersNegativeTestJSON-844965327-project-member] Closing write handle for https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/a80c5472-2f66-41fa-af34-9b95f1444da8/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=68906) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1856.905045] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-40f1605b-c74c-47ef-8da8-a8d333fe3403 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1856.912888] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dc71ecf7-fee4-4c45-9d57-e6db789d6813 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1856.942750] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ad272a16-37c7-4263-9003-a49389c1b6af {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1856.949810] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e59e6d13-bc65-4d32-b069-91e17443928e {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1856.963650] env[68906]: DEBUG nova.compute.provider_tree [None req-405761bb-a2e6-43b7-ab88-ccfe847f6af7 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Inventory has not changed in ProviderTree for provider: 1119f6db-bfd7-4ef3-bdff-5c6974dc249b {{(pid=68906) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1856.972956] env[68906]: DEBUG nova.scheduler.client.report [None req-405761bb-a2e6-43b7-ab88-ccfe847f6af7 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Inventory has not changed for provider 1119f6db-bfd7-4ef3-bdff-5c6974dc249b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 93, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68906) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1856.986926] env[68906]: DEBUG oslo_concurrency.lockutils [None req-405761bb-a2e6-43b7-ab88-ccfe847f6af7 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.363s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1856.987539] env[68906]: ERROR nova.compute.manager [None req-405761bb-a2e6-43b7-ab88-ccfe847f6af7 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] [instance: 89171680-c76d-4826-9236-379542661ffb] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1856.987539] env[68906]: Faults: ['InvalidArgument'] [ 1856.987539] env[68906]: ERROR nova.compute.manager [instance: 89171680-c76d-4826-9236-379542661ffb] Traceback (most recent call last): [ 1856.987539] env[68906]: ERROR nova.compute.manager [instance: 89171680-c76d-4826-9236-379542661ffb] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1856.987539] 
env[68906]: ERROR nova.compute.manager [instance: 89171680-c76d-4826-9236-379542661ffb] self.driver.spawn(context, instance, image_meta, [ 1856.987539] env[68906]: ERROR nova.compute.manager [instance: 89171680-c76d-4826-9236-379542661ffb] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1856.987539] env[68906]: ERROR nova.compute.manager [instance: 89171680-c76d-4826-9236-379542661ffb] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1856.987539] env[68906]: ERROR nova.compute.manager [instance: 89171680-c76d-4826-9236-379542661ffb] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1856.987539] env[68906]: ERROR nova.compute.manager [instance: 89171680-c76d-4826-9236-379542661ffb] self._fetch_image_if_missing(context, vi) [ 1856.987539] env[68906]: ERROR nova.compute.manager [instance: 89171680-c76d-4826-9236-379542661ffb] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1856.987539] env[68906]: ERROR nova.compute.manager [instance: 89171680-c76d-4826-9236-379542661ffb] image_cache(vi, tmp_image_ds_loc) [ 1856.987539] env[68906]: ERROR nova.compute.manager [instance: 89171680-c76d-4826-9236-379542661ffb] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1856.987935] env[68906]: ERROR nova.compute.manager [instance: 89171680-c76d-4826-9236-379542661ffb] vm_util.copy_virtual_disk( [ 1856.987935] env[68906]: ERROR nova.compute.manager [instance: 89171680-c76d-4826-9236-379542661ffb] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1856.987935] env[68906]: ERROR nova.compute.manager [instance: 89171680-c76d-4826-9236-379542661ffb] session._wait_for_task(vmdk_copy_task) [ 1856.987935] env[68906]: ERROR nova.compute.manager [instance: 89171680-c76d-4826-9236-379542661ffb] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1856.987935] env[68906]: ERROR nova.compute.manager [instance: 89171680-c76d-4826-9236-379542661ffb] return self.wait_for_task(task_ref) [ 1856.987935] env[68906]: ERROR nova.compute.manager [instance: 89171680-c76d-4826-9236-379542661ffb] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1856.987935] env[68906]: ERROR nova.compute.manager [instance: 89171680-c76d-4826-9236-379542661ffb] return evt.wait() [ 1856.987935] env[68906]: ERROR nova.compute.manager [instance: 89171680-c76d-4826-9236-379542661ffb] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1856.987935] env[68906]: ERROR nova.compute.manager [instance: 89171680-c76d-4826-9236-379542661ffb] result = hub.switch() [ 1856.987935] env[68906]: ERROR nova.compute.manager [instance: 89171680-c76d-4826-9236-379542661ffb] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1856.987935] env[68906]: ERROR nova.compute.manager [instance: 89171680-c76d-4826-9236-379542661ffb] return self.greenlet.switch() [ 1856.987935] env[68906]: ERROR nova.compute.manager [instance: 89171680-c76d-4826-9236-379542661ffb] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1856.987935] env[68906]: ERROR nova.compute.manager [instance: 89171680-c76d-4826-9236-379542661ffb] self.f(*self.args, **self.kw) [ 1856.988508] env[68906]: ERROR nova.compute.manager [instance: 89171680-c76d-4826-9236-379542661ffb] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1856.988508] env[68906]: ERROR nova.compute.manager [instance: 89171680-c76d-4826-9236-379542661ffb] raise exceptions.translate_fault(task_info.error) [ 1856.988508] env[68906]: ERROR nova.compute.manager [instance: 89171680-c76d-4826-9236-379542661ffb] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1856.988508] env[68906]: ERROR nova.compute.manager [instance: 89171680-c76d-4826-9236-379542661ffb] Faults: ['InvalidArgument'] [ 1856.988508] env[68906]: ERROR nova.compute.manager [instance: 89171680-c76d-4826-9236-379542661ffb] [ 1856.988508] env[68906]: DEBUG nova.compute.utils [None req-405761bb-a2e6-43b7-ab88-ccfe847f6af7 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] [instance: 89171680-c76d-4826-9236-379542661ffb] VimFaultException {{(pid=68906) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1856.989568] env[68906]: DEBUG nova.compute.manager [None req-405761bb-a2e6-43b7-ab88-ccfe847f6af7 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] [instance: 89171680-c76d-4826-9236-379542661ffb] Build of instance 89171680-c76d-4826-9236-379542661ffb was re-scheduled: A specified parameter was not correct: fileType [ 1856.989568] env[68906]: Faults: ['InvalidArgument'] {{(pid=68906) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1856.989938] env[68906]: DEBUG nova.compute.manager [None req-405761bb-a2e6-43b7-ab88-ccfe847f6af7 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] [instance: 89171680-c76d-4826-9236-379542661ffb] Unplugging VIFs for instance {{(pid=68906) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1856.990122] env[68906]: DEBUG nova.compute.manager [None req-405761bb-a2e6-43b7-ab88-ccfe847f6af7 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=68906) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1856.990292] env[68906]: DEBUG nova.compute.manager [None req-405761bb-a2e6-43b7-ab88-ccfe847f6af7 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] [instance: 89171680-c76d-4826-9236-379542661ffb] Deallocating network for instance {{(pid=68906) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1856.990452] env[68906]: DEBUG nova.network.neutron [None req-405761bb-a2e6-43b7-ab88-ccfe847f6af7 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] [instance: 89171680-c76d-4826-9236-379542661ffb] deallocate_for_instance() {{(pid=68906) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1857.141060] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1857.293764] env[68906]: DEBUG nova.network.neutron [None req-405761bb-a2e6-43b7-ab88-ccfe847f6af7 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] [instance: 89171680-c76d-4826-9236-379542661ffb] Updating instance_info_cache with network_info: [] {{(pid=68906) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1857.306956] env[68906]: INFO nova.compute.manager [None req-405761bb-a2e6-43b7-ab88-ccfe847f6af7 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] [instance: 89171680-c76d-4826-9236-379542661ffb] Took 0.32 seconds to deallocate network for instance. 
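The traceback above bottoms out in oslo.vmware's _poll_task, which turns a failed task result into a VimFaultException carrying the vCenter fault names and message; Nova lets it propagate out of spawn() and re-schedules the build, as the surrounding records show. A hedged sketch of how that exception reaches a caller of wait_for_task (session and vmdk_copy_task stand in for the objects in the traceback; the except branch is illustrative, not Nova's literal handling):

    from oslo_vmware import exceptions as vexc

    try:
        session.wait_for_task(vmdk_copy_task)
    except vexc.VimFaultException as exc:
        # exc.fault_list holds the fault names logged above, ['InvalidArgument'];
        # the message is "A specified parameter was not correct: fileType".
        if 'InvalidArgument' in exc.fault_list:
            raise  # build fails; _do_build_and_run_instance re-schedules it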
[ 1857.426721] env[68906]: INFO nova.scheduler.client.report [None req-405761bb-a2e6-43b7-ab88-ccfe847f6af7 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Deleted allocations for instance 89171680-c76d-4826-9236-379542661ffb [ 1857.460712] env[68906]: DEBUG oslo_concurrency.lockutils [None req-405761bb-a2e6-43b7-ab88-ccfe847f6af7 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Lock "89171680-c76d-4826-9236-379542661ffb" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 672.719s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1857.462104] env[68906]: DEBUG oslo_concurrency.lockutils [None req-2be6fa1d-6d86-4f82-877d-7c51f7330fb4 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Lock "89171680-c76d-4826-9236-379542661ffb" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 476.897s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1857.462326] env[68906]: DEBUG oslo_concurrency.lockutils [None req-2be6fa1d-6d86-4f82-877d-7c51f7330fb4 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Acquiring lock "89171680-c76d-4826-9236-379542661ffb-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1857.462531] env[68906]: DEBUG oslo_concurrency.lockutils [None req-2be6fa1d-6d86-4f82-877d-7c51f7330fb4 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Lock "89171680-c76d-4826-9236-379542661ffb-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1857.462734] env[68906]: DEBUG oslo_concurrency.lockutils [None req-2be6fa1d-6d86-4f82-877d-7c51f7330fb4 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Lock "89171680-c76d-4826-9236-379542661ffb-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1857.464697] env[68906]: INFO nova.compute.manager [None req-2be6fa1d-6d86-4f82-877d-7c51f7330fb4 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] [instance: 89171680-c76d-4826-9236-379542661ffb] Terminating instance [ 1857.466367] env[68906]: DEBUG nova.compute.manager [None req-2be6fa1d-6d86-4f82-877d-7c51f7330fb4 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] [instance: 89171680-c76d-4826-9236-379542661ffb] Start destroying the instance on the hypervisor.
{{(pid=68906) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1857.466556] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-2be6fa1d-6d86-4f82-877d-7c51f7330fb4 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] [instance: 89171680-c76d-4826-9236-379542661ffb] Destroying instance {{(pid=68906) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1857.467099] env[68906]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-011b6c20-367a-4160-a671-6eee3a0e34e8 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1857.472030] env[68906]: DEBUG nova.compute.manager [None req-d367298b-38d5-456e-960c-a902d561e42c tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] [instance: 3cfde5a7-3148-426c-8867-ffafb33dc95b] Starting instance... {{(pid=68906) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1857.478161] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8fc608ed-57bd-4d8a-9f62-8e2366740a2e {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1857.508100] env[68906]: WARNING nova.virt.vmwareapi.vmops [None req-2be6fa1d-6d86-4f82-877d-7c51f7330fb4 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] [instance: 89171680-c76d-4826-9236-379542661ffb] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 89171680-c76d-4826-9236-379542661ffb could not be found. [ 1857.508287] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-2be6fa1d-6d86-4f82-877d-7c51f7330fb4 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] [instance: 89171680-c76d-4826-9236-379542661ffb] Instance destroyed {{(pid=68906) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1857.508467] env[68906]: INFO nova.compute.manager [None req-2be6fa1d-6d86-4f82-877d-7c51f7330fb4 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] [instance: 89171680-c76d-4826-9236-379542661ffb] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1857.508707] env[68906]: DEBUG oslo.service.loopingcall [None req-2be6fa1d-6d86-4f82-877d-7c51f7330fb4 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return.
{{(pid=68906) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1857.510916] env[68906]: DEBUG nova.compute.manager [-] [instance: 89171680-c76d-4826-9236-379542661ffb] Deallocating network for instance {{(pid=68906) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1857.511042] env[68906]: DEBUG nova.network.neutron [-] [instance: 89171680-c76d-4826-9236-379542661ffb] deallocate_for_instance() {{(pid=68906) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1857.526578] env[68906]: DEBUG oslo_concurrency.lockutils [None req-d367298b-38d5-456e-960c-a902d561e42c tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1857.526838] env[68906]: DEBUG oslo_concurrency.lockutils [None req-d367298b-38d5-456e-960c-a902d561e42c tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1857.528309] env[68906]: INFO nova.compute.claims [None req-d367298b-38d5-456e-960c-a902d561e42c tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] [instance: 3cfde5a7-3148-426c-8867-ffafb33dc95b] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1857.537736] env[68906]: DEBUG nova.network.neutron [-] [instance: 89171680-c76d-4826-9236-379542661ffb] Updating instance_info_cache with network_info: [] {{(pid=68906) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1857.547908] env[68906]: INFO nova.compute.manager [-] [instance: 89171680-c76d-4826-9236-379542661ffb] Took 0.04 seconds to deallocate network for instance. [ 1857.631328] env[68906]: DEBUG oslo_concurrency.lockutils [None req-2be6fa1d-6d86-4f82-877d-7c51f7330fb4 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Lock "89171680-c76d-4826-9236-379542661ffb" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.169s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1857.632252] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Lock "89171680-c76d-4826-9236-379542661ffb" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 34.462s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1857.632451] env[68906]: INFO nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: 89171680-c76d-4826-9236-379542661ffb] During sync_power_state the instance has a pending task (deleting). Skip.
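The waited/held bookkeeping in this stretch comes from oslo.concurrency: lockutils.synchronized wraps the decorated function in an inner() closure that logs how long the caller waited for the named lock and how long it held it, which is why the terminate request above shows "waited 476.897s" after queuing behind the instance lock that was held for the entire failed build. A minimal sketch of the pattern, with an illustrative function rather than Nova's exact call sites:

    from oslo_concurrency import lockutils

    @lockutils.synchronized('compute_resources')
    def instance_claim(context, instance, nodename):
        # Runs with the "compute_resources" lock held; the wrapper emits the
        # 'acquired by ... waited' line on entry and '"released" ... held' on exit.
        ...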
[ 1857.632678] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Lock "89171680-c76d-4826-9236-379542661ffb" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1857.745352] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-28907152-348e-4207-b2a4-fb93e5912134 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1857.754900] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8d6d4522-eb8d-4e42-8bc8-e1c005b913d8 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1857.784047] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d48ac448-9bf8-481e-9e32-1968eb0757f2 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1857.791790] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-90f9451f-dc4e-482b-93d9-f0e4fa0872b8 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1857.805218] env[68906]: DEBUG nova.compute.provider_tree [None req-d367298b-38d5-456e-960c-a902d561e42c tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] Inventory has not changed in ProviderTree for provider: 1119f6db-bfd7-4ef3-bdff-5c6974dc249b {{(pid=68906) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1857.813919] env[68906]: DEBUG nova.scheduler.client.report [None req-d367298b-38d5-456e-960c-a902d561e42c tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] Inventory has not changed for provider 1119f6db-bfd7-4ef3-bdff-5c6974dc249b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 93, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68906) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1857.828702] env[68906]: DEBUG oslo_concurrency.lockutils [None req-d367298b-38d5-456e-960c-a902d561e42c tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.302s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1857.829185] env[68906]: DEBUG nova.compute.manager [None req-d367298b-38d5-456e-960c-a902d561e42c tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] [instance: 3cfde5a7-3148-426c-8867-ffafb33dc95b] Start building networks asynchronously for instance.
{{(pid=68906) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1857.862689] env[68906]: DEBUG nova.compute.utils [None req-d367298b-38d5-456e-960c-a902d561e42c tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] Using /dev/sd instead of None {{(pid=68906) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1857.863914] env[68906]: DEBUG nova.compute.manager [None req-d367298b-38d5-456e-960c-a902d561e42c tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] [instance: 3cfde5a7-3148-426c-8867-ffafb33dc95b] Allocating IP information in the background. {{(pid=68906) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1857.864101] env[68906]: DEBUG nova.network.neutron [None req-d367298b-38d5-456e-960c-a902d561e42c tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] [instance: 3cfde5a7-3148-426c-8867-ffafb33dc95b] allocate_for_instance() {{(pid=68906) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1857.873813] env[68906]: DEBUG nova.compute.manager [None req-d367298b-38d5-456e-960c-a902d561e42c tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] [instance: 3cfde5a7-3148-426c-8867-ffafb33dc95b] Start building block device mappings for instance. {{(pid=68906) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1857.926440] env[68906]: DEBUG nova.policy [None req-d367298b-38d5-456e-960c-a902d561e42c tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'fa8acbdb3f304f67ba13b02e547844d1', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '35ea959a162d451db5103b94bf7da26a', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68906) authorize /opt/stack/nova/nova/policy.py:203}} [ 1857.939888] env[68906]: DEBUG nova.compute.manager [None req-d367298b-38d5-456e-960c-a902d561e42c tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] [instance: 3cfde5a7-3148-426c-8867-ffafb33dc95b] Start spawning the instance on the hypervisor. 
{{(pid=68906) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1857.964722] env[68906]: DEBUG nova.virt.hardware [None req-d367298b-38d5-456e-960c-a902d561e42c tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-17T13:00:39Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-17T13:00:23Z,direct_url=,disk_format='vmdk',id=b1400c31-d33b-4e13-944f-4c645e62493e,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='1ae7bf3a375d41c6af5e7536af51ffd1',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-17T13:00:24Z,virtual_size=,visibility=), allow threads: False {{(pid=68906) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1857.964968] env[68906]: DEBUG nova.virt.hardware [None req-d367298b-38d5-456e-960c-a902d561e42c tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] Flavor limits 0:0:0 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1857.965141] env[68906]: DEBUG nova.virt.hardware [None req-d367298b-38d5-456e-960c-a902d561e42c tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] Image limits 0:0:0 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1857.965326] env[68906]: DEBUG nova.virt.hardware [None req-d367298b-38d5-456e-960c-a902d561e42c tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] Flavor pref 0:0:0 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1857.965474] env[68906]: DEBUG nova.virt.hardware [None req-d367298b-38d5-456e-960c-a902d561e42c tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] Image pref 0:0:0 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1857.965623] env[68906]: DEBUG nova.virt.hardware [None req-d367298b-38d5-456e-960c-a902d561e42c tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1857.965836] env[68906]: DEBUG nova.virt.hardware [None req-d367298b-38d5-456e-960c-a902d561e42c tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68906) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1857.965994] env[68906]: DEBUG nova.virt.hardware [None req-d367298b-38d5-456e-960c-a902d561e42c tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68906) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1857.966178] env[68906]: DEBUG nova.virt.hardware [None req-d367298b-38d5-456e-960c-a902d561e42c tempest-ImagesTestJSON-1546870080 
tempest-ImagesTestJSON-1546870080-project-member] Got 1 possible topologies {{(pid=68906) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1857.966346] env[68906]: DEBUG nova.virt.hardware [None req-d367298b-38d5-456e-960c-a902d561e42c tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68906) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1857.966518] env[68906]: DEBUG nova.virt.hardware [None req-d367298b-38d5-456e-960c-a902d561e42c tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68906) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1857.967404] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0248fc15-3a81-4695-b080-1f57f314e44a {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1857.975746] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a435bdbd-4ef6-409d-9873-99895b0847c2 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1858.228654] env[68906]: DEBUG nova.network.neutron [None req-d367298b-38d5-456e-960c-a902d561e42c tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] [instance: 3cfde5a7-3148-426c-8867-ffafb33dc95b] Successfully created port: d8ebeccb-5265-44e7-88b2-6c8dd1ab2bdf {{(pid=68906) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1858.932207] env[68906]: DEBUG nova.network.neutron [None req-d367298b-38d5-456e-960c-a902d561e42c tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] [instance: 3cfde5a7-3148-426c-8867-ffafb33dc95b] Successfully updated port: d8ebeccb-5265-44e7-88b2-6c8dd1ab2bdf {{(pid=68906) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1858.945482] env[68906]: DEBUG oslo_concurrency.lockutils [None req-d367298b-38d5-456e-960c-a902d561e42c tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] Acquiring lock "refresh_cache-3cfde5a7-3148-426c-8867-ffafb33dc95b" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1858.945693] env[68906]: DEBUG oslo_concurrency.lockutils [None req-d367298b-38d5-456e-960c-a902d561e42c tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] Acquired lock "refresh_cache-3cfde5a7-3148-426c-8867-ffafb33dc95b" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1858.945782] env[68906]: DEBUG nova.network.neutron [None req-d367298b-38d5-456e-960c-a902d561e42c tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] [instance: 3cfde5a7-3148-426c-8867-ffafb33dc95b] Building network info cache for instance {{(pid=68906) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1858.983116] env[68906]: DEBUG nova.network.neutron [None req-d367298b-38d5-456e-960c-a902d561e42c tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] [instance: 3cfde5a7-3148-426c-8867-ffafb33dc95b] Instance cache missing network info. 
{{(pid=68906) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1859.135393] env[68906]: DEBUG nova.network.neutron [None req-d367298b-38d5-456e-960c-a902d561e42c tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] [instance: 3cfde5a7-3148-426c-8867-ffafb33dc95b] Updating instance_info_cache with network_info: [{"id": "d8ebeccb-5265-44e7-88b2-6c8dd1ab2bdf", "address": "fa:16:3e:a2:b2:f7", "network": {"id": "42998f86-911a-4af7-93b7-ffe19e2cd70c", "bridge": "br-int", "label": "tempest-ImagesTestJSON-563323785-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "35ea959a162d451db5103b94bf7da26a", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ea4fe416-47a6-4542-b59d-8c71ab4d6503", "external-id": "nsx-vlan-transportzone-369", "segmentation_id": 369, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapd8ebeccb-52", "ovs_interfaceid": "d8ebeccb-5265-44e7-88b2-6c8dd1ab2bdf", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68906) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1859.136664] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1859.147402] env[68906]: DEBUG oslo_concurrency.lockutils [None req-d367298b-38d5-456e-960c-a902d561e42c tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] Releasing lock "refresh_cache-3cfde5a7-3148-426c-8867-ffafb33dc95b" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1859.147692] env[68906]: DEBUG nova.compute.manager [None req-d367298b-38d5-456e-960c-a902d561e42c tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] [instance: 3cfde5a7-3148-426c-8867-ffafb33dc95b] Instance network_info: |[{"id": "d8ebeccb-5265-44e7-88b2-6c8dd1ab2bdf", "address": "fa:16:3e:a2:b2:f7", "network": {"id": "42998f86-911a-4af7-93b7-ffe19e2cd70c", "bridge": "br-int", "label": "tempest-ImagesTestJSON-563323785-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "35ea959a162d451db5103b94bf7da26a", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ea4fe416-47a6-4542-b59d-8c71ab4d6503", "external-id": "nsx-vlan-transportzone-369", "segmentation_id": 369, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapd8ebeccb-52", "ovs_interfaceid": 
"d8ebeccb-5265-44e7-88b2-6c8dd1ab2bdf", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=68906) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1859.148086] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-d367298b-38d5-456e-960c-a902d561e42c tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] [instance: 3cfde5a7-3148-426c-8867-ffafb33dc95b] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:a2:b2:f7', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'ea4fe416-47a6-4542-b59d-8c71ab4d6503', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'd8ebeccb-5265-44e7-88b2-6c8dd1ab2bdf', 'vif_model': 'vmxnet3'}] {{(pid=68906) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1859.156016] env[68906]: DEBUG oslo.service.loopingcall [None req-d367298b-38d5-456e-960c-a902d561e42c tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=68906) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1859.156488] env[68906]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 3cfde5a7-3148-426c-8867-ffafb33dc95b] Creating VM on the ESX host {{(pid=68906) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1859.156718] env[68906]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-b37e74b8-6081-4966-aded-f0b708a6003b {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1859.177244] env[68906]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1859.177244] env[68906]: value = "task-3475441" [ 1859.177244] env[68906]: _type = "Task" [ 1859.177244] env[68906]: } to complete. {{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1859.184884] env[68906]: DEBUG oslo_vmware.api [-] Task: {'id': task-3475441, 'name': CreateVM_Task} progress is 0%. 
{{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1859.369955] env[68906]: DEBUG nova.compute.manager [req-89ea94b3-a70d-4e13-95b9-e875d3c9dd83 req-d4b9eec1-4829-46d8-afd1-a7d563b186e1 service nova] [instance: 3cfde5a7-3148-426c-8867-ffafb33dc95b] Received event network-vif-plugged-d8ebeccb-5265-44e7-88b2-6c8dd1ab2bdf {{(pid=68906) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1859.370336] env[68906]: DEBUG oslo_concurrency.lockutils [req-89ea94b3-a70d-4e13-95b9-e875d3c9dd83 req-d4b9eec1-4829-46d8-afd1-a7d563b186e1 service nova] Acquiring lock "3cfde5a7-3148-426c-8867-ffafb33dc95b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1859.370495] env[68906]: DEBUG oslo_concurrency.lockutils [req-89ea94b3-a70d-4e13-95b9-e875d3c9dd83 req-d4b9eec1-4829-46d8-afd1-a7d563b186e1 service nova] Lock "3cfde5a7-3148-426c-8867-ffafb33dc95b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1859.370731] env[68906]: DEBUG oslo_concurrency.lockutils [req-89ea94b3-a70d-4e13-95b9-e875d3c9dd83 req-d4b9eec1-4829-46d8-afd1-a7d563b186e1 service nova] Lock "3cfde5a7-3148-426c-8867-ffafb33dc95b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1859.370805] env[68906]: DEBUG nova.compute.manager [req-89ea94b3-a70d-4e13-95b9-e875d3c9dd83 req-d4b9eec1-4829-46d8-afd1-a7d563b186e1 service nova] [instance: 3cfde5a7-3148-426c-8867-ffafb33dc95b] No waiting events found dispatching network-vif-plugged-d8ebeccb-5265-44e7-88b2-6c8dd1ab2bdf {{(pid=68906) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1859.370968] env[68906]: WARNING nova.compute.manager [req-89ea94b3-a70d-4e13-95b9-e875d3c9dd83 req-d4b9eec1-4829-46d8-afd1-a7d563b186e1 service nova] [instance: 3cfde5a7-3148-426c-8867-ffafb33dc95b] Received unexpected event network-vif-plugged-d8ebeccb-5265-44e7-88b2-6c8dd1ab2bdf for instance with vm_state building and task_state spawning. [ 1859.371172] env[68906]: DEBUG nova.compute.manager [req-89ea94b3-a70d-4e13-95b9-e875d3c9dd83 req-d4b9eec1-4829-46d8-afd1-a7d563b186e1 service nova] [instance: 3cfde5a7-3148-426c-8867-ffafb33dc95b] Received event network-changed-d8ebeccb-5265-44e7-88b2-6c8dd1ab2bdf {{(pid=68906) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1859.371340] env[68906]: DEBUG nova.compute.manager [req-89ea94b3-a70d-4e13-95b9-e875d3c9dd83 req-d4b9eec1-4829-46d8-afd1-a7d563b186e1 service nova] [instance: 3cfde5a7-3148-426c-8867-ffafb33dc95b] Refreshing instance network info cache due to event network-changed-d8ebeccb-5265-44e7-88b2-6c8dd1ab2bdf.
{{(pid=68906) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1859.371522] env[68906]: DEBUG oslo_concurrency.lockutils [req-89ea94b3-a70d-4e13-95b9-e875d3c9dd83 req-d4b9eec1-4829-46d8-afd1-a7d563b186e1 service nova] Acquiring lock "refresh_cache-3cfde5a7-3148-426c-8867-ffafb33dc95b" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1859.371732] env[68906]: DEBUG oslo_concurrency.lockutils [req-89ea94b3-a70d-4e13-95b9-e875d3c9dd83 req-d4b9eec1-4829-46d8-afd1-a7d563b186e1 service nova] Acquired lock "refresh_cache-3cfde5a7-3148-426c-8867-ffafb33dc95b" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1859.371901] env[68906]: DEBUG nova.network.neutron [req-89ea94b3-a70d-4e13-95b9-e875d3c9dd83 req-d4b9eec1-4829-46d8-afd1-a7d563b186e1 service nova] [instance: 3cfde5a7-3148-426c-8867-ffafb33dc95b] Refreshing network info cache for port d8ebeccb-5265-44e7-88b2-6c8dd1ab2bdf {{(pid=68906) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1859.611684] env[68906]: DEBUG nova.network.neutron [req-89ea94b3-a70d-4e13-95b9-e875d3c9dd83 req-d4b9eec1-4829-46d8-afd1-a7d563b186e1 service nova] [instance: 3cfde5a7-3148-426c-8867-ffafb33dc95b] Updated VIF entry in instance network info cache for port d8ebeccb-5265-44e7-88b2-6c8dd1ab2bdf. {{(pid=68906) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1859.612075] env[68906]: DEBUG nova.network.neutron [req-89ea94b3-a70d-4e13-95b9-e875d3c9dd83 req-d4b9eec1-4829-46d8-afd1-a7d563b186e1 service nova] [instance: 3cfde5a7-3148-426c-8867-ffafb33dc95b] Updating instance_info_cache with network_info: [{"id": "d8ebeccb-5265-44e7-88b2-6c8dd1ab2bdf", "address": "fa:16:3e:a2:b2:f7", "network": {"id": "42998f86-911a-4af7-93b7-ffe19e2cd70c", "bridge": "br-int", "label": "tempest-ImagesTestJSON-563323785-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "35ea959a162d451db5103b94bf7da26a", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ea4fe416-47a6-4542-b59d-8c71ab4d6503", "external-id": "nsx-vlan-transportzone-369", "segmentation_id": 369, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapd8ebeccb-52", "ovs_interfaceid": "d8ebeccb-5265-44e7-88b2-6c8dd1ab2bdf", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68906) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1859.627533] env[68906]: DEBUG oslo_concurrency.lockutils [req-89ea94b3-a70d-4e13-95b9-e875d3c9dd83 req-d4b9eec1-4829-46d8-afd1-a7d563b186e1 service nova] Releasing lock "refresh_cache-3cfde5a7-3148-426c-8867-ffafb33dc95b" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1859.686937] env[68906]: DEBUG oslo_vmware.api [-] Task: {'id': task-3475441, 'name': CreateVM_Task, 'duration_secs': 0.314717} completed successfully. 
{{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1859.687126] env[68906]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 3cfde5a7-3148-426c-8867-ffafb33dc95b] Created VM on the ESX host {{(pid=68906) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1859.687727] env[68906]: DEBUG oslo_concurrency.lockutils [None req-d367298b-38d5-456e-960c-a902d561e42c tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1859.687892] env[68906]: DEBUG oslo_concurrency.lockutils [None req-d367298b-38d5-456e-960c-a902d561e42c tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] Acquired lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1859.688223] env[68906]: DEBUG oslo_concurrency.lockutils [None req-d367298b-38d5-456e-960c-a902d561e42c tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1859.688466] env[68906]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-a318ef9a-d358-4bcd-ab53-661cffb1fcfb {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1859.692655] env[68906]: DEBUG oslo_vmware.api [None req-d367298b-38d5-456e-960c-a902d561e42c tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] Waiting for the task: (returnval){ [ 1859.692655] env[68906]: value = "session[52a3cb0d-1212-490c-2669-91043b4da4d8]52d21409-0743-f6a6-fe07-7c0764e1c1f3" [ 1859.692655] env[68906]: _type = "Task" [ 1859.692655] env[68906]: } to complete. {{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1859.699858] env[68906]: DEBUG oslo_vmware.api [None req-d367298b-38d5-456e-960c-a902d561e42c tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] Task: {'id': session[52a3cb0d-1212-490c-2669-91043b4da4d8]52d21409-0743-f6a6-fe07-7c0764e1c1f3, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1860.203099] env[68906]: DEBUG oslo_concurrency.lockutils [None req-d367298b-38d5-456e-960c-a902d561e42c tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] Releasing lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1860.203398] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-d367298b-38d5-456e-960c-a902d561e42c tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] [instance: 3cfde5a7-3148-426c-8867-ffafb33dc95b] Processing image b1400c31-d33b-4e13-944f-4c645e62493e {{(pid=68906) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1860.203567] env[68906]: DEBUG oslo_concurrency.lockutils [None req-d367298b-38d5-456e-960c-a902d561e42c tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e/b1400c31-d33b-4e13-944f-4c645e62493e.vmdk" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1861.140390] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager.update_available_resource {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1861.152494] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1861.152747] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1861.152964] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1861.153148] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=68906) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1861.154265] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-78b8f980-0b15-43e4-b635-201c2b5b2382 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1861.163556] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-76f5ed01-8723-43bb-a465-3ea7878c9537 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1861.177383] env[68906]: DEBUG 
oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ca713b66-860a-40f0-8c79-2120d77f98a7 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1861.183614] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-395b3272-0f20-4fb2-bffa-32d7890b4f02 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1861.214115] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180917MB free_disk=93GB free_vcpus=48 pci_devices=None {{(pid=68906) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1861.214372] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1861.214481] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1861.286163] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 9b884416-df89-4d8c-b2ab-0667db52a718 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1861.286336] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance aed06616-d008-4695-b66e-9f40acf5ebd3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1861.286465] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 17327bc3-433e-4006-93c7-e53714ed70c2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1861.286601] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 32f5b54d-30bf-4fe9-9622-3ff74344b3f3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1861.286701] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 922d81ba-c8d2-43ba-b1c5-f2943418d6a2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1861.286817] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 736db39c-e5e5-4a54-b85a-aa5c703f432e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1861.286933] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance ce6e5cd6-efb8-46d1-811d-74c084661cce actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1861.287058] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 7994d291-b4bf-48f5-ad34-c1f484d77f6e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1861.287175] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 860248ea-e77b-4ff6-af64-b75f88a31348 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1861.287285] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 3cfde5a7-3148-426c-8867-ffafb33dc95b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1861.297905] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 709defd2-4089-410e-b317-c41c97e01f62 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1861.311194] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 01b79dfa-cd20-495d-b112-8429c28b741e has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1861.321306] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 8bfc91d4-b1d7-449a-8d48-0e63490fe663 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1861.331487] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance d70b039d-c8ad-4ffd-84f8-08f17cb97578 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1861.331767] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=68906) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1861.331969] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=68906) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1861.499500] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8c2a2e17-26b2-415a-aacb-210e16f31e56 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1861.507399] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e018d262-b8e3-4580-a938-187f134858e8 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1861.537246] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2d190c9c-0ea9-4d86-8277-7b8d8b7756a8 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1861.544184] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-41a0e665-2b35-4757-a28d-2b236db7526a {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1861.557082] env[68906]: DEBUG nova.compute.provider_tree [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Inventory has not changed in ProviderTree for provider: 1119f6db-bfd7-4ef3-bdff-5c6974dc249b {{(pid=68906) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1861.566680] env[68906]: DEBUG nova.scheduler.client.report [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Inventory has not changed for provider 1119f6db-bfd7-4ef3-bdff-5c6974dc249b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 93, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68906) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1861.579786] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=68906) 
_update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1861.579974] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.365s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1874.752210] env[68906]: DEBUG oslo_concurrency.lockutils [None req-823968b1-da09-4e0a-ab45-09a081b0e509 tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] Acquiring lock "3cfde5a7-3148-426c-8867-ffafb33dc95b" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1882.260124] env[68906]: DEBUG oslo_concurrency.lockutils [None req-a0bccc79-5040-43c0-84e3-cf4ab89f848f tempest-DeleteServersTestJSON-1763795391 tempest-DeleteServersTestJSON-1763795391-project-member] Acquiring lock "ed276c3c-6085-427d-b3b7-86bbb8660dbc" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1882.260444] env[68906]: DEBUG oslo_concurrency.lockutils [None req-a0bccc79-5040-43c0-84e3-cf4ab89f848f tempest-DeleteServersTestJSON-1763795391 tempest-DeleteServersTestJSON-1763795391-project-member] Lock "ed276c3c-6085-427d-b3b7-86bbb8660dbc" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1902.916882] env[68906]: WARNING oslo_vmware.rw_handles [None req-82d3bc4e-32c1-4af2-8d04-b9efe25d88fc tempest-ListServersNegativeTestJSON-844965327 tempest-ListServersNegativeTestJSON-844965327-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1902.916882] env[68906]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1902.916882] env[68906]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1902.916882] env[68906]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1902.916882] env[68906]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1902.916882] env[68906]: ERROR oslo_vmware.rw_handles response.begin() [ 1902.916882] env[68906]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1902.916882] env[68906]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1902.916882] env[68906]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1902.916882] env[68906]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1902.916882] env[68906]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1902.916882] env[68906]: ERROR oslo_vmware.rw_handles [ 1902.918269] env[68906]: DEBUG nova.virt.vmwareapi.images [None req-82d3bc4e-32c1-4af2-8d04-b9efe25d88fc tempest-ListServersNegativeTestJSON-844965327 tempest-ListServersNegativeTestJSON-844965327-project-member] 
[instance: 9b884416-df89-4d8c-b2ab-0667db52a718] Downloaded image file data b1400c31-d33b-4e13-944f-4c645e62493e to vmware_temp/a80c5472-2f66-41fa-af34-9b95f1444da8/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk on the data store datastore2 {{(pid=68906) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1902.919508] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-82d3bc4e-32c1-4af2-8d04-b9efe25d88fc tempest-ListServersNegativeTestJSON-844965327 tempest-ListServersNegativeTestJSON-844965327-project-member] [instance: 9b884416-df89-4d8c-b2ab-0667db52a718] Caching image {{(pid=68906) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1902.919771] env[68906]: DEBUG nova.virt.vmwareapi.vm_util [None req-82d3bc4e-32c1-4af2-8d04-b9efe25d88fc tempest-ListServersNegativeTestJSON-844965327 tempest-ListServersNegativeTestJSON-844965327-project-member] Copying Virtual Disk [datastore2] vmware_temp/a80c5472-2f66-41fa-af34-9b95f1444da8/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk to [datastore2] vmware_temp/a80c5472-2f66-41fa-af34-9b95f1444da8/b1400c31-d33b-4e13-944f-4c645e62493e/b1400c31-d33b-4e13-944f-4c645e62493e.vmdk {{(pid=68906) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1902.920076] env[68906]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-88c0ce2c-0c2c-4ff8-afc3-407c9f88b9af {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1902.927529] env[68906]: DEBUG oslo_vmware.api [None req-82d3bc4e-32c1-4af2-8d04-b9efe25d88fc tempest-ListServersNegativeTestJSON-844965327 tempest-ListServersNegativeTestJSON-844965327-project-member] Waiting for the task: (returnval){ [ 1902.927529] env[68906]: value = "task-3475442" [ 1902.927529] env[68906]: _type = "Task" [ 1902.927529] env[68906]: } to complete. {{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1902.935452] env[68906]: DEBUG oslo_vmware.api [None req-82d3bc4e-32c1-4af2-8d04-b9efe25d88fc tempest-ListServersNegativeTestJSON-844965327 tempest-ListServersNegativeTestJSON-844965327-project-member] Task: {'id': task-3475442, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1903.439424] env[68906]: DEBUG oslo_vmware.exceptions [None req-82d3bc4e-32c1-4af2-8d04-b9efe25d88fc tempest-ListServersNegativeTestJSON-844965327 tempest-ListServersNegativeTestJSON-844965327-project-member] Fault InvalidArgument not matched. 
{{(pid=68906) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1903.439766] env[68906]: DEBUG oslo_concurrency.lockutils [None req-82d3bc4e-32c1-4af2-8d04-b9efe25d88fc tempest-ListServersNegativeTestJSON-844965327 tempest-ListServersNegativeTestJSON-844965327-project-member] Releasing lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e/b1400c31-d33b-4e13-944f-4c645e62493e.vmdk" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1903.440382] env[68906]: ERROR nova.compute.manager [None req-82d3bc4e-32c1-4af2-8d04-b9efe25d88fc tempest-ListServersNegativeTestJSON-844965327 tempest-ListServersNegativeTestJSON-844965327-project-member] [instance: 9b884416-df89-4d8c-b2ab-0667db52a718] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1903.440382] env[68906]: Faults: ['InvalidArgument'] [ 1903.440382] env[68906]: ERROR nova.compute.manager [instance: 9b884416-df89-4d8c-b2ab-0667db52a718] Traceback (most recent call last): [ 1903.440382] env[68906]: ERROR nova.compute.manager [instance: 9b884416-df89-4d8c-b2ab-0667db52a718] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1903.440382] env[68906]: ERROR nova.compute.manager [instance: 9b884416-df89-4d8c-b2ab-0667db52a718] yield resources [ 1903.440382] env[68906]: ERROR nova.compute.manager [instance: 9b884416-df89-4d8c-b2ab-0667db52a718] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1903.440382] env[68906]: ERROR nova.compute.manager [instance: 9b884416-df89-4d8c-b2ab-0667db52a718] self.driver.spawn(context, instance, image_meta, [ 1903.440382] env[68906]: ERROR nova.compute.manager [instance: 9b884416-df89-4d8c-b2ab-0667db52a718] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1903.440382] env[68906]: ERROR nova.compute.manager [instance: 9b884416-df89-4d8c-b2ab-0667db52a718] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1903.440382] env[68906]: ERROR nova.compute.manager [instance: 9b884416-df89-4d8c-b2ab-0667db52a718] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1903.440382] env[68906]: ERROR nova.compute.manager [instance: 9b884416-df89-4d8c-b2ab-0667db52a718] self._fetch_image_if_missing(context, vi) [ 1903.440382] env[68906]: ERROR nova.compute.manager [instance: 9b884416-df89-4d8c-b2ab-0667db52a718] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1903.440805] env[68906]: ERROR nova.compute.manager [instance: 9b884416-df89-4d8c-b2ab-0667db52a718] image_cache(vi, tmp_image_ds_loc) [ 1903.440805] env[68906]: ERROR nova.compute.manager [instance: 9b884416-df89-4d8c-b2ab-0667db52a718] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1903.440805] env[68906]: ERROR nova.compute.manager [instance: 9b884416-df89-4d8c-b2ab-0667db52a718] vm_util.copy_virtual_disk( [ 1903.440805] env[68906]: ERROR nova.compute.manager [instance: 9b884416-df89-4d8c-b2ab-0667db52a718] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1903.440805] env[68906]: ERROR nova.compute.manager [instance: 9b884416-df89-4d8c-b2ab-0667db52a718] session._wait_for_task(vmdk_copy_task) [ 1903.440805] env[68906]: ERROR nova.compute.manager [instance: 9b884416-df89-4d8c-b2ab-0667db52a718] File 
"/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1903.440805] env[68906]: ERROR nova.compute.manager [instance: 9b884416-df89-4d8c-b2ab-0667db52a718] return self.wait_for_task(task_ref) [ 1903.440805] env[68906]: ERROR nova.compute.manager [instance: 9b884416-df89-4d8c-b2ab-0667db52a718] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1903.440805] env[68906]: ERROR nova.compute.manager [instance: 9b884416-df89-4d8c-b2ab-0667db52a718] return evt.wait() [ 1903.440805] env[68906]: ERROR nova.compute.manager [instance: 9b884416-df89-4d8c-b2ab-0667db52a718] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1903.440805] env[68906]: ERROR nova.compute.manager [instance: 9b884416-df89-4d8c-b2ab-0667db52a718] result = hub.switch() [ 1903.440805] env[68906]: ERROR nova.compute.manager [instance: 9b884416-df89-4d8c-b2ab-0667db52a718] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1903.440805] env[68906]: ERROR nova.compute.manager [instance: 9b884416-df89-4d8c-b2ab-0667db52a718] return self.greenlet.switch() [ 1903.441253] env[68906]: ERROR nova.compute.manager [instance: 9b884416-df89-4d8c-b2ab-0667db52a718] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1903.441253] env[68906]: ERROR nova.compute.manager [instance: 9b884416-df89-4d8c-b2ab-0667db52a718] self.f(*self.args, **self.kw) [ 1903.441253] env[68906]: ERROR nova.compute.manager [instance: 9b884416-df89-4d8c-b2ab-0667db52a718] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1903.441253] env[68906]: ERROR nova.compute.manager [instance: 9b884416-df89-4d8c-b2ab-0667db52a718] raise exceptions.translate_fault(task_info.error) [ 1903.441253] env[68906]: ERROR nova.compute.manager [instance: 9b884416-df89-4d8c-b2ab-0667db52a718] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1903.441253] env[68906]: ERROR nova.compute.manager [instance: 9b884416-df89-4d8c-b2ab-0667db52a718] Faults: ['InvalidArgument'] [ 1903.441253] env[68906]: ERROR nova.compute.manager [instance: 9b884416-df89-4d8c-b2ab-0667db52a718] [ 1903.441253] env[68906]: INFO nova.compute.manager [None req-82d3bc4e-32c1-4af2-8d04-b9efe25d88fc tempest-ListServersNegativeTestJSON-844965327 tempest-ListServersNegativeTestJSON-844965327-project-member] [instance: 9b884416-df89-4d8c-b2ab-0667db52a718] Terminating instance [ 1903.442688] env[68906]: DEBUG oslo_concurrency.lockutils [None req-1ff0dcfe-1419-4b8a-a4ce-0afaadd97797 tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] Acquired lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e/b1400c31-d33b-4e13-944f-4c645e62493e.vmdk" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1903.442999] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-1ff0dcfe-1419-4b8a-a4ce-0afaadd97797 tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=68906) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1903.443290] env[68906]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-d243db39-3c1e-488e-a554-31391acc0dbd {{(pid=68906) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1903.445701] env[68906]: DEBUG nova.compute.manager [None req-82d3bc4e-32c1-4af2-8d04-b9efe25d88fc tempest-ListServersNegativeTestJSON-844965327 tempest-ListServersNegativeTestJSON-844965327-project-member] [instance: 9b884416-df89-4d8c-b2ab-0667db52a718] Start destroying the instance on the hypervisor. {{(pid=68906) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1903.445941] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-82d3bc4e-32c1-4af2-8d04-b9efe25d88fc tempest-ListServersNegativeTestJSON-844965327 tempest-ListServersNegativeTestJSON-844965327-project-member] [instance: 9b884416-df89-4d8c-b2ab-0667db52a718] Destroying instance {{(pid=68906) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1903.446704] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-24c92ae3-a4b6-4db9-bcda-d4867f560747 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1903.454596] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-82d3bc4e-32c1-4af2-8d04-b9efe25d88fc tempest-ListServersNegativeTestJSON-844965327 tempest-ListServersNegativeTestJSON-844965327-project-member] [instance: 9b884416-df89-4d8c-b2ab-0667db52a718] Unregistering the VM {{(pid=68906) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1903.455723] env[68906]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-29c15e38-9d9a-476f-b06a-5a7de3c26706 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1903.457330] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-1ff0dcfe-1419-4b8a-a4ce-0afaadd97797 tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=68906) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1903.457490] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-1ff0dcfe-1419-4b8a-a4ce-0afaadd97797 tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=68906) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1903.458200] env[68906]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-cccfc8e8-f166-4a80-9f21-38278e862c4c {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1903.463180] env[68906]: DEBUG oslo_vmware.api [None req-1ff0dcfe-1419-4b8a-a4ce-0afaadd97797 tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] Waiting for the task: (returnval){ [ 1903.463180] env[68906]: value = "session[52a3cb0d-1212-490c-2669-91043b4da4d8]52f016d6-1604-e70e-1c84-016b43a11432" [ 1903.463180] env[68906]: _type = "Task" [ 1903.463180] env[68906]: } to complete. {{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1903.470702] env[68906]: DEBUG oslo_vmware.api [None req-1ff0dcfe-1419-4b8a-a4ce-0afaadd97797 tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] Task: {'id': session[52a3cb0d-1212-490c-2669-91043b4da4d8]52f016d6-1604-e70e-1c84-016b43a11432, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1903.529121] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-82d3bc4e-32c1-4af2-8d04-b9efe25d88fc tempest-ListServersNegativeTestJSON-844965327 tempest-ListServersNegativeTestJSON-844965327-project-member] [instance: 9b884416-df89-4d8c-b2ab-0667db52a718] Unregistered the VM {{(pid=68906) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1903.529422] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-82d3bc4e-32c1-4af2-8d04-b9efe25d88fc tempest-ListServersNegativeTestJSON-844965327 tempest-ListServersNegativeTestJSON-844965327-project-member] [instance: 9b884416-df89-4d8c-b2ab-0667db52a718] Deleting contents of the VM from datastore datastore2 {{(pid=68906) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1903.529650] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-82d3bc4e-32c1-4af2-8d04-b9efe25d88fc tempest-ListServersNegativeTestJSON-844965327 tempest-ListServersNegativeTestJSON-844965327-project-member] Deleting the datastore file [datastore2] 9b884416-df89-4d8c-b2ab-0667db52a718 {{(pid=68906) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1903.529943] env[68906]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-005e3ae2-b6a8-4e15-9df1-435655ee519a {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1903.536125] env[68906]: DEBUG oslo_vmware.api [None req-82d3bc4e-32c1-4af2-8d04-b9efe25d88fc tempest-ListServersNegativeTestJSON-844965327 tempest-ListServersNegativeTestJSON-844965327-project-member] Waiting for the task: (returnval){ [ 1903.536125] env[68906]: value = "task-3475444" [ 1903.536125] env[68906]: _type = "Task" [ 1903.536125] env[68906]: } to complete. {{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1903.543391] env[68906]: DEBUG oslo_vmware.api [None req-82d3bc4e-32c1-4af2-8d04-b9efe25d88fc tempest-ListServersNegativeTestJSON-844965327 tempest-ListServersNegativeTestJSON-844965327-project-member] Task: {'id': task-3475444, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1903.973425] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-1ff0dcfe-1419-4b8a-a4ce-0afaadd97797 tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] [instance: aed06616-d008-4695-b66e-9f40acf5ebd3] Preparing fetch location {{(pid=68906) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1903.973752] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-1ff0dcfe-1419-4b8a-a4ce-0afaadd97797 tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] Creating directory with path [datastore2] vmware_temp/8ea48b0f-1bc1-4cfe-8dee-96dc72209bdf/b1400c31-d33b-4e13-944f-4c645e62493e {{(pid=68906) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1903.973928] env[68906]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-18785f96-3664-43da-9cca-c137bff092f2 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1903.984827] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-1ff0dcfe-1419-4b8a-a4ce-0afaadd97797 tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] Created directory with path [datastore2] vmware_temp/8ea48b0f-1bc1-4cfe-8dee-96dc72209bdf/b1400c31-d33b-4e13-944f-4c645e62493e {{(pid=68906) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1903.985032] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-1ff0dcfe-1419-4b8a-a4ce-0afaadd97797 tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] [instance: aed06616-d008-4695-b66e-9f40acf5ebd3] Fetch image to [datastore2] vmware_temp/8ea48b0f-1bc1-4cfe-8dee-96dc72209bdf/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk {{(pid=68906) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1903.985212] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-1ff0dcfe-1419-4b8a-a4ce-0afaadd97797 tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] [instance: aed06616-d008-4695-b66e-9f40acf5ebd3] Downloading image file data b1400c31-d33b-4e13-944f-4c645e62493e to [datastore2] vmware_temp/8ea48b0f-1bc1-4cfe-8dee-96dc72209bdf/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk on the data store datastore2 {{(pid=68906) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1903.985915] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5d83cac4-0a1f-4f8c-808a-20fbc2fe7b47 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1903.992074] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b7227c53-91fe-4e32-bcfd-6b5219d8d647 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1904.000896] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-890d569d-fdfa-407f-9efa-21fcdfca322b {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1904.030469] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-476b793e-60a0-49be-9889-44e307c61ae1 {{(pid=68906) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1904.035544] env[68906]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-dbe8637a-18d5-41b0-9821-435c600c197d {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1904.044739] env[68906]: DEBUG oslo_vmware.api [None req-82d3bc4e-32c1-4af2-8d04-b9efe25d88fc tempest-ListServersNegativeTestJSON-844965327 tempest-ListServersNegativeTestJSON-844965327-project-member] Task: {'id': task-3475444, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.063197} completed successfully. {{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1904.044961] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-82d3bc4e-32c1-4af2-8d04-b9efe25d88fc tempest-ListServersNegativeTestJSON-844965327 tempest-ListServersNegativeTestJSON-844965327-project-member] Deleted the datastore file {{(pid=68906) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1904.045170] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-82d3bc4e-32c1-4af2-8d04-b9efe25d88fc tempest-ListServersNegativeTestJSON-844965327 tempest-ListServersNegativeTestJSON-844965327-project-member] [instance: 9b884416-df89-4d8c-b2ab-0667db52a718] Deleted contents of the VM from datastore datastore2 {{(pid=68906) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1904.045351] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-82d3bc4e-32c1-4af2-8d04-b9efe25d88fc tempest-ListServersNegativeTestJSON-844965327 tempest-ListServersNegativeTestJSON-844965327-project-member] [instance: 9b884416-df89-4d8c-b2ab-0667db52a718] Instance destroyed {{(pid=68906) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1904.045526] env[68906]: INFO nova.compute.manager [None req-82d3bc4e-32c1-4af2-8d04-b9efe25d88fc tempest-ListServersNegativeTestJSON-844965327 tempest-ListServersNegativeTestJSON-844965327-project-member] [instance: 9b884416-df89-4d8c-b2ab-0667db52a718] Took 0.60 seconds to destroy the instance on the hypervisor. 
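The DeleteDatastoreFile_Task sequence just above (invoke, "Waiting for the task ... to complete", "_poll_task ... progress is 0%", "completed successfully") is the standard oslo.vmware invoke/wait pattern. A minimal sketch of that pattern, assuming oslo.vmware's public session API; the endpoint, credentials, and datacenter argument below are placeholders, not values taken from this log:

from oslo_vmware import api

# Placeholder endpoint/credentials; a real session needs a reachable vCenter.
session = api.VMwareAPISession('vc.example.test', 'user', 'secret',
                               api_retry_count=10, task_poll_interval=0.5)

# vSphere *_Task methods return a Task managed-object reference immediately,
# not the result itself.
file_manager = session.vim.service_content.fileManager
task = session.invoke_api(
    session.vim, 'DeleteDatastoreFile_Task', file_manager,
    name='[datastore2] 9b884416-df89-4d8c-b2ab-0667db52a718',
    datacenter=None)  # placeholder; a real caller passes the Datacenter moref

# Polls TaskInfo until completion (the "_poll_task ... progress" entries) and
# raises a translated VimFaultException if the task ends in error, which is
# exactly how the InvalidArgument fault surfaced earlier in this log.
session.wait_for_task(task)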
[ 1904.047513] env[68906]: DEBUG nova.compute.claims [None req-82d3bc4e-32c1-4af2-8d04-b9efe25d88fc tempest-ListServersNegativeTestJSON-844965327 tempest-ListServersNegativeTestJSON-844965327-project-member] [instance: 9b884416-df89-4d8c-b2ab-0667db52a718] Aborting claim: {{(pid=68906) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1904.047684] env[68906]: DEBUG oslo_concurrency.lockutils [None req-82d3bc4e-32c1-4af2-8d04-b9efe25d88fc tempest-ListServersNegativeTestJSON-844965327 tempest-ListServersNegativeTestJSON-844965327-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1904.047896] env[68906]: DEBUG oslo_concurrency.lockutils [None req-82d3bc4e-32c1-4af2-8d04-b9efe25d88fc tempest-ListServersNegativeTestJSON-844965327 tempest-ListServersNegativeTestJSON-844965327-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1904.060375] env[68906]: DEBUG nova.virt.vmwareapi.images [None req-1ff0dcfe-1419-4b8a-a4ce-0afaadd97797 tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] [instance: aed06616-d008-4695-b66e-9f40acf5ebd3] Downloading image file data b1400c31-d33b-4e13-944f-4c645e62493e to the data store datastore2 {{(pid=68906) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1904.113217] env[68906]: DEBUG oslo_vmware.rw_handles [None req-1ff0dcfe-1419-4b8a-a4ce-0afaadd97797 tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/8ea48b0f-1bc1-4cfe-8dee-96dc72209bdf/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=68906) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1904.173406] env[68906]: DEBUG oslo_vmware.rw_handles [None req-1ff0dcfe-1419-4b8a-a4ce-0afaadd97797 tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] Completed reading data from the image iterator. {{(pid=68906) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1904.173604] env[68906]: DEBUG oslo_vmware.rw_handles [None req-1ff0dcfe-1419-4b8a-a4ce-0afaadd97797 tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] Closing write handle for https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/8ea48b0f-1bc1-4cfe-8dee-96dc72209bdf/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=68906) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1904.301133] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5a452af0-d699-4367-bc4c-d8a0d62f73e2 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1904.308541] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6a4b66f1-2378-42c8-ac7f-1db240dcbf92 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1904.337603] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-af8527ca-4777-4a91-a64d-42005dd9585d {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1904.345071] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dec61891-8b0d-41fa-9f80-87b1dec415a7 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1904.357355] env[68906]: DEBUG nova.compute.provider_tree [None req-82d3bc4e-32c1-4af2-8d04-b9efe25d88fc tempest-ListServersNegativeTestJSON-844965327 tempest-ListServersNegativeTestJSON-844965327-project-member] Inventory has not changed in ProviderTree for provider: 1119f6db-bfd7-4ef3-bdff-5c6974dc249b {{(pid=68906) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1904.366317] env[68906]: DEBUG nova.scheduler.client.report [None req-82d3bc4e-32c1-4af2-8d04-b9efe25d88fc tempest-ListServersNegativeTestJSON-844965327 tempest-ListServersNegativeTestJSON-844965327-project-member] Inventory has not changed for provider 1119f6db-bfd7-4ef3-bdff-5c6974dc249b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 93, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68906) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1904.380566] env[68906]: DEBUG oslo_concurrency.lockutils [None req-82d3bc4e-32c1-4af2-8d04-b9efe25d88fc tempest-ListServersNegativeTestJSON-844965327 tempest-ListServersNegativeTestJSON-844965327-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.333s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1904.381141] env[68906]: ERROR nova.compute.manager [None req-82d3bc4e-32c1-4af2-8d04-b9efe25d88fc tempest-ListServersNegativeTestJSON-844965327 tempest-ListServersNegativeTestJSON-844965327-project-member] [instance: 9b884416-df89-4d8c-b2ab-0667db52a718] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1904.381141] env[68906]: Faults: ['InvalidArgument'] [ 1904.381141] env[68906]: ERROR nova.compute.manager [instance: 9b884416-df89-4d8c-b2ab-0667db52a718] Traceback (most recent call last): [ 1904.381141] env[68906]: ERROR nova.compute.manager [instance: 9b884416-df89-4d8c-b2ab-0667db52a718] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance 
[ 1904.381141] env[68906]: ERROR nova.compute.manager [instance: 9b884416-df89-4d8c-b2ab-0667db52a718] self.driver.spawn(context, instance, image_meta, [ 1904.381141] env[68906]: ERROR nova.compute.manager [instance: 9b884416-df89-4d8c-b2ab-0667db52a718] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1904.381141] env[68906]: ERROR nova.compute.manager [instance: 9b884416-df89-4d8c-b2ab-0667db52a718] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1904.381141] env[68906]: ERROR nova.compute.manager [instance: 9b884416-df89-4d8c-b2ab-0667db52a718] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1904.381141] env[68906]: ERROR nova.compute.manager [instance: 9b884416-df89-4d8c-b2ab-0667db52a718] self._fetch_image_if_missing(context, vi) [ 1904.381141] env[68906]: ERROR nova.compute.manager [instance: 9b884416-df89-4d8c-b2ab-0667db52a718] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1904.381141] env[68906]: ERROR nova.compute.manager [instance: 9b884416-df89-4d8c-b2ab-0667db52a718] image_cache(vi, tmp_image_ds_loc) [ 1904.381141] env[68906]: ERROR nova.compute.manager [instance: 9b884416-df89-4d8c-b2ab-0667db52a718] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1904.381567] env[68906]: ERROR nova.compute.manager [instance: 9b884416-df89-4d8c-b2ab-0667db52a718] vm_util.copy_virtual_disk( [ 1904.381567] env[68906]: ERROR nova.compute.manager [instance: 9b884416-df89-4d8c-b2ab-0667db52a718] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1904.381567] env[68906]: ERROR nova.compute.manager [instance: 9b884416-df89-4d8c-b2ab-0667db52a718] session._wait_for_task(vmdk_copy_task) [ 1904.381567] env[68906]: ERROR nova.compute.manager [instance: 9b884416-df89-4d8c-b2ab-0667db52a718] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1904.381567] env[68906]: ERROR nova.compute.manager [instance: 9b884416-df89-4d8c-b2ab-0667db52a718] return self.wait_for_task(task_ref) [ 1904.381567] env[68906]: ERROR nova.compute.manager [instance: 9b884416-df89-4d8c-b2ab-0667db52a718] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1904.381567] env[68906]: ERROR nova.compute.manager [instance: 9b884416-df89-4d8c-b2ab-0667db52a718] return evt.wait() [ 1904.381567] env[68906]: ERROR nova.compute.manager [instance: 9b884416-df89-4d8c-b2ab-0667db52a718] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1904.381567] env[68906]: ERROR nova.compute.manager [instance: 9b884416-df89-4d8c-b2ab-0667db52a718] result = hub.switch() [ 1904.381567] env[68906]: ERROR nova.compute.manager [instance: 9b884416-df89-4d8c-b2ab-0667db52a718] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1904.381567] env[68906]: ERROR nova.compute.manager [instance: 9b884416-df89-4d8c-b2ab-0667db52a718] return self.greenlet.switch() [ 1904.381567] env[68906]: ERROR nova.compute.manager [instance: 9b884416-df89-4d8c-b2ab-0667db52a718] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1904.381567] env[68906]: ERROR nova.compute.manager [instance: 9b884416-df89-4d8c-b2ab-0667db52a718] self.f(*self.args, **self.kw) [ 1904.381980] env[68906]: ERROR nova.compute.manager [instance: 
9b884416-df89-4d8c-b2ab-0667db52a718] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1904.381980] env[68906]: ERROR nova.compute.manager [instance: 9b884416-df89-4d8c-b2ab-0667db52a718] raise exceptions.translate_fault(task_info.error) [ 1904.381980] env[68906]: ERROR nova.compute.manager [instance: 9b884416-df89-4d8c-b2ab-0667db52a718] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1904.381980] env[68906]: ERROR nova.compute.manager [instance: 9b884416-df89-4d8c-b2ab-0667db52a718] Faults: ['InvalidArgument'] [ 1904.381980] env[68906]: ERROR nova.compute.manager [instance: 9b884416-df89-4d8c-b2ab-0667db52a718] [ 1904.381980] env[68906]: DEBUG nova.compute.utils [None req-82d3bc4e-32c1-4af2-8d04-b9efe25d88fc tempest-ListServersNegativeTestJSON-844965327 tempest-ListServersNegativeTestJSON-844965327-project-member] [instance: 9b884416-df89-4d8c-b2ab-0667db52a718] VimFaultException {{(pid=68906) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1904.383284] env[68906]: DEBUG nova.compute.manager [None req-82d3bc4e-32c1-4af2-8d04-b9efe25d88fc tempest-ListServersNegativeTestJSON-844965327 tempest-ListServersNegativeTestJSON-844965327-project-member] [instance: 9b884416-df89-4d8c-b2ab-0667db52a718] Build of instance 9b884416-df89-4d8c-b2ab-0667db52a718 was re-scheduled: A specified parameter was not correct: fileType [ 1904.383284] env[68906]: Faults: ['InvalidArgument'] {{(pid=68906) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1904.383653] env[68906]: DEBUG nova.compute.manager [None req-82d3bc4e-32c1-4af2-8d04-b9efe25d88fc tempest-ListServersNegativeTestJSON-844965327 tempest-ListServersNegativeTestJSON-844965327-project-member] [instance: 9b884416-df89-4d8c-b2ab-0667db52a718] Unplugging VIFs for instance {{(pid=68906) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1904.383828] env[68906]: DEBUG nova.compute.manager [None req-82d3bc4e-32c1-4af2-8d04-b9efe25d88fc tempest-ListServersNegativeTestJSON-844965327 tempest-ListServersNegativeTestJSON-844965327-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=68906) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1904.384006] env[68906]: DEBUG nova.compute.manager [None req-82d3bc4e-32c1-4af2-8d04-b9efe25d88fc tempest-ListServersNegativeTestJSON-844965327 tempest-ListServersNegativeTestJSON-844965327-project-member] [instance: 9b884416-df89-4d8c-b2ab-0667db52a718] Deallocating network for instance {{(pid=68906) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1904.384185] env[68906]: DEBUG nova.network.neutron [None req-82d3bc4e-32c1-4af2-8d04-b9efe25d88fc tempest-ListServersNegativeTestJSON-844965327 tempest-ListServersNegativeTestJSON-844965327-project-member] [instance: 9b884416-df89-4d8c-b2ab-0667db52a718] deallocate_for_instance() {{(pid=68906) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1904.706727] env[68906]: DEBUG nova.network.neutron [None req-82d3bc4e-32c1-4af2-8d04-b9efe25d88fc tempest-ListServersNegativeTestJSON-844965327 tempest-ListServersNegativeTestJSON-844965327-project-member] [instance: 9b884416-df89-4d8c-b2ab-0667db52a718] Updating instance_info_cache with network_info: [] {{(pid=68906) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1904.716497] env[68906]: INFO nova.compute.manager [None req-82d3bc4e-32c1-4af2-8d04-b9efe25d88fc tempest-ListServersNegativeTestJSON-844965327 tempest-ListServersNegativeTestJSON-844965327-project-member] [instance: 9b884416-df89-4d8c-b2ab-0667db52a718] Took 0.33 seconds to deallocate network for instance. [ 1904.801259] env[68906]: INFO nova.scheduler.client.report [None req-82d3bc4e-32c1-4af2-8d04-b9efe25d88fc tempest-ListServersNegativeTestJSON-844965327 tempest-ListServersNegativeTestJSON-844965327-project-member] Deleted allocations for instance 9b884416-df89-4d8c-b2ab-0667db52a718 [ 1904.820561] env[68906]: DEBUG oslo_concurrency.lockutils [None req-82d3bc4e-32c1-4af2-8d04-b9efe25d88fc tempest-ListServersNegativeTestJSON-844965327 tempest-ListServersNegativeTestJSON-844965327-project-member] Lock "9b884416-df89-4d8c-b2ab-0667db52a718" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 669.644s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1904.821755] env[68906]: DEBUG oslo_concurrency.lockutils [None req-d21cc286-8d6d-4956-a84c-6659f57d9db9 tempest-ListServersNegativeTestJSON-844965327 tempest-ListServersNegativeTestJSON-844965327-project-member] Lock "9b884416-df89-4d8c-b2ab-0667db52a718" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 473.939s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1904.821981] env[68906]: DEBUG oslo_concurrency.lockutils [None req-d21cc286-8d6d-4956-a84c-6659f57d9db9 tempest-ListServersNegativeTestJSON-844965327 tempest-ListServersNegativeTestJSON-844965327-project-member] Acquiring lock "9b884416-df89-4d8c-b2ab-0667db52a718-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1904.822205] env[68906]: DEBUG oslo_concurrency.lockutils [None req-d21cc286-8d6d-4956-a84c-6659f57d9db9 tempest-ListServersNegativeTestJSON-844965327 tempest-ListServersNegativeTestJSON-844965327-project-member] Lock "9b884416-df89-4d8c-b2ab-0667db52a718-events" acquired by 
"nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1904.822374] env[68906]: DEBUG oslo_concurrency.lockutils [None req-d21cc286-8d6d-4956-a84c-6659f57d9db9 tempest-ListServersNegativeTestJSON-844965327 tempest-ListServersNegativeTestJSON-844965327-project-member] Lock "9b884416-df89-4d8c-b2ab-0667db52a718-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1904.824319] env[68906]: INFO nova.compute.manager [None req-d21cc286-8d6d-4956-a84c-6659f57d9db9 tempest-ListServersNegativeTestJSON-844965327 tempest-ListServersNegativeTestJSON-844965327-project-member] [instance: 9b884416-df89-4d8c-b2ab-0667db52a718] Terminating instance [ 1904.825937] env[68906]: DEBUG nova.compute.manager [None req-d21cc286-8d6d-4956-a84c-6659f57d9db9 tempest-ListServersNegativeTestJSON-844965327 tempest-ListServersNegativeTestJSON-844965327-project-member] [instance: 9b884416-df89-4d8c-b2ab-0667db52a718] Start destroying the instance on the hypervisor. {{(pid=68906) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1904.826146] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-d21cc286-8d6d-4956-a84c-6659f57d9db9 tempest-ListServersNegativeTestJSON-844965327 tempest-ListServersNegativeTestJSON-844965327-project-member] [instance: 9b884416-df89-4d8c-b2ab-0667db52a718] Destroying instance {{(pid=68906) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1904.826689] env[68906]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-1b3a8113-b1ed-4cd3-8534-c7f0cac7fcd8 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1904.839045] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-af69a8c8-0c97-4202-ac57-1cbe77bed280 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1904.848776] env[68906]: DEBUG nova.compute.manager [None req-67ed4526-96df-48ba-bd9f-4546cfb77ff4 tempest-DeleteServersTestJSON-1763795391 tempest-DeleteServersTestJSON-1763795391-project-member] [instance: 709defd2-4089-410e-b317-c41c97e01f62] Starting instance... {{(pid=68906) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1904.869674] env[68906]: WARNING nova.virt.vmwareapi.vmops [None req-d21cc286-8d6d-4956-a84c-6659f57d9db9 tempest-ListServersNegativeTestJSON-844965327 tempest-ListServersNegativeTestJSON-844965327-project-member] [instance: 9b884416-df89-4d8c-b2ab-0667db52a718] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 9b884416-df89-4d8c-b2ab-0667db52a718 could not be found. 
[ 1904.869880] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-d21cc286-8d6d-4956-a84c-6659f57d9db9 tempest-ListServersNegativeTestJSON-844965327 tempest-ListServersNegativeTestJSON-844965327-project-member] [instance: 9b884416-df89-4d8c-b2ab-0667db52a718] Instance destroyed {{(pid=68906) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1904.870073] env[68906]: INFO nova.compute.manager [None req-d21cc286-8d6d-4956-a84c-6659f57d9db9 tempest-ListServersNegativeTestJSON-844965327 tempest-ListServersNegativeTestJSON-844965327-project-member] [instance: 9b884416-df89-4d8c-b2ab-0667db52a718] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1904.870326] env[68906]: DEBUG oslo.service.loopingcall [None req-d21cc286-8d6d-4956-a84c-6659f57d9db9 tempest-ListServersNegativeTestJSON-844965327 tempest-ListServersNegativeTestJSON-844965327-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=68906) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1904.870555] env[68906]: DEBUG nova.compute.manager [-] [instance: 9b884416-df89-4d8c-b2ab-0667db52a718] Deallocating network for instance {{(pid=68906) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1904.870654] env[68906]: DEBUG nova.network.neutron [-] [instance: 9b884416-df89-4d8c-b2ab-0667db52a718] deallocate_for_instance() {{(pid=68906) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1904.872976] env[68906]: DEBUG nova.compute.manager [None req-67ed4526-96df-48ba-bd9f-4546cfb77ff4 tempest-DeleteServersTestJSON-1763795391 tempest-DeleteServersTestJSON-1763795391-project-member] [instance: 709defd2-4089-410e-b317-c41c97e01f62] Instance disappeared before build. {{(pid=68906) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1904.891215] env[68906]: DEBUG oslo_concurrency.lockutils [None req-67ed4526-96df-48ba-bd9f-4546cfb77ff4 tempest-DeleteServersTestJSON-1763795391 tempest-DeleteServersTestJSON-1763795391-project-member] Lock "709defd2-4089-410e-b317-c41c97e01f62" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 219.302s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1904.895853] env[68906]: DEBUG nova.network.neutron [-] [instance: 9b884416-df89-4d8c-b2ab-0667db52a718] Updating instance_info_cache with network_info: [] {{(pid=68906) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1904.901514] env[68906]: DEBUG nova.compute.manager [None req-f44832be-2f4a-4582-8745-029b5c53cab6 tempest-ServerActionsTestOtherA-1507860010 tempest-ServerActionsTestOtherA-1507860010-project-member] [instance: 01b79dfa-cd20-495d-b112-8429c28b741e] Starting instance... {{(pid=68906) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1904.904471] env[68906]: INFO nova.compute.manager [-] [instance: 9b884416-df89-4d8c-b2ab-0667db52a718] Took 0.03 seconds to deallocate network for instance. 
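The "Waiting for function ... _deallocate_network_with_retries to return" entry above is oslo.service's retry machinery blocking until the wrapped function finishes. A minimal sketch assuming oslo.service's RetryDecorator (whose inner "func" wrapper matches the loopingcall.py reference in that entry); the retry parameters, exception type, and body are illustrative, not nova's actual values:

from oslo_service import loopingcall

# Re-invokes the wrapped function with increasing sleeps while it raises one
# of the listed exceptions; the wrapper logs the "Waiting for function ...
# to return." DEBUG entry seen above.
@loopingcall.RetryDecorator(max_retry_count=3, inc_sleep_time=1,
                            max_sleep_time=10, exceptions=(RuntimeError,))
def _deallocate_with_retries():
    pass  # placeholder for the neutron deallocate_for_instance() call

_deallocate_with_retries()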
[ 1904.950151] env[68906]: DEBUG oslo_concurrency.lockutils [None req-f44832be-2f4a-4582-8745-029b5c53cab6 tempest-ServerActionsTestOtherA-1507860010 tempest-ServerActionsTestOtherA-1507860010-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1904.950402] env[68906]: DEBUG oslo_concurrency.lockutils [None req-f44832be-2f4a-4582-8745-029b5c53cab6 tempest-ServerActionsTestOtherA-1507860010 tempest-ServerActionsTestOtherA-1507860010-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1904.951840] env[68906]: INFO nova.compute.claims [None req-f44832be-2f4a-4582-8745-029b5c53cab6 tempest-ServerActionsTestOtherA-1507860010 tempest-ServerActionsTestOtherA-1507860010-project-member] [instance: 01b79dfa-cd20-495d-b112-8429c28b741e] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1904.985191] env[68906]: DEBUG oslo_concurrency.lockutils [None req-d21cc286-8d6d-4956-a84c-6659f57d9db9 tempest-ListServersNegativeTestJSON-844965327 tempest-ListServersNegativeTestJSON-844965327-project-member] Lock "9b884416-df89-4d8c-b2ab-0667db52a718" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.163s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1904.986044] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Lock "9b884416-df89-4d8c-b2ab-0667db52a718" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 81.816s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1904.986162] env[68906]: INFO nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: 9b884416-df89-4d8c-b2ab-0667db52a718] During sync_power_state the instance has a pending task (deleting). Skip. 
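Each "Inventory has not changed for provider ... based on inventory data" entry above compares against the same per-resource-class dicts. Placement's schedulable capacity for a class is (total - reserved) * allocation_ratio, with max_unit capping any single allocation. A small worked check against the values logged above:

# Effective capacity implied by the inventory repeatedly logged above:
#   effective = (total - reserved) * allocation_ratio
inventory = {
    'VCPU': {'total': 48, 'reserved': 0, 'allocation_ratio': 4.0},
    'MEMORY_MB': {'total': 196590, 'reserved': 512, 'allocation_ratio': 1.0},
    'DISK_GB': {'total': 400, 'reserved': 0, 'allocation_ratio': 1.0},
}
for rc, inv in inventory.items():
    effective = (inv['total'] - inv['reserved']) * inv['allocation_ratio']
    print(rc, effective)
# -> VCPU 192.0, MEMORY_MB 196078.0, DISK_GB 400.0; max_unit still limits a
#    single allocation to 16 VCPU / 65530 MB / 93 GB respectively, which is
#    why free_disk reads 93GB while DISK_GB total is 400.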
[ 1904.986335] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Lock "9b884416-df89-4d8c-b2ab-0667db52a718" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1905.133527] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f74b5347-f9ff-4172-9043-9cc1465bed1c {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1905.141114] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-906eb910-c9c8-4d1a-9f99-14702375337e {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1905.171779] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3e898d5d-542b-4207-9b01-5803940cab45 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1905.179205] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7018a771-a0c2-4958-ae20-86140b40af17 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1905.192829] env[68906]: DEBUG nova.compute.provider_tree [None req-f44832be-2f4a-4582-8745-029b5c53cab6 tempest-ServerActionsTestOtherA-1507860010 tempest-ServerActionsTestOtherA-1507860010-project-member] Inventory has not changed in ProviderTree for provider: 1119f6db-bfd7-4ef3-bdff-5c6974dc249b {{(pid=68906) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1905.202421] env[68906]: DEBUG nova.scheduler.client.report [None req-f44832be-2f4a-4582-8745-029b5c53cab6 tempest-ServerActionsTestOtherA-1507860010 tempest-ServerActionsTestOtherA-1507860010-project-member] Inventory has not changed for provider 1119f6db-bfd7-4ef3-bdff-5c6974dc249b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 93, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68906) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1905.215086] env[68906]: DEBUG oslo_concurrency.lockutils [None req-f44832be-2f4a-4582-8745-029b5c53cab6 tempest-ServerActionsTestOtherA-1507860010 tempest-ServerActionsTestOtherA-1507860010-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.265s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1905.215555] env[68906]: DEBUG nova.compute.manager [None req-f44832be-2f4a-4582-8745-029b5c53cab6 tempest-ServerActionsTestOtherA-1507860010 tempest-ServerActionsTestOtherA-1507860010-project-member] [instance: 01b79dfa-cd20-495d-b112-8429c28b741e] Start building networks asynchronously for instance. 
{{(pid=68906) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1905.247044] env[68906]: DEBUG nova.compute.utils [None req-f44832be-2f4a-4582-8745-029b5c53cab6 tempest-ServerActionsTestOtherA-1507860010 tempest-ServerActionsTestOtherA-1507860010-project-member] Using /dev/sd instead of None {{(pid=68906) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1905.248314] env[68906]: DEBUG nova.compute.manager [None req-f44832be-2f4a-4582-8745-029b5c53cab6 tempest-ServerActionsTestOtherA-1507860010 tempest-ServerActionsTestOtherA-1507860010-project-member] [instance: 01b79dfa-cd20-495d-b112-8429c28b741e] Allocating IP information in the background. {{(pid=68906) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1905.248485] env[68906]: DEBUG nova.network.neutron [None req-f44832be-2f4a-4582-8745-029b5c53cab6 tempest-ServerActionsTestOtherA-1507860010 tempest-ServerActionsTestOtherA-1507860010-project-member] [instance: 01b79dfa-cd20-495d-b112-8429c28b741e] allocate_for_instance() {{(pid=68906) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1905.256848] env[68906]: DEBUG nova.compute.manager [None req-f44832be-2f4a-4582-8745-029b5c53cab6 tempest-ServerActionsTestOtherA-1507860010 tempest-ServerActionsTestOtherA-1507860010-project-member] [instance: 01b79dfa-cd20-495d-b112-8429c28b741e] Start building block device mappings for instance. {{(pid=68906) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1905.311356] env[68906]: DEBUG nova.policy [None req-f44832be-2f4a-4582-8745-029b5c53cab6 tempest-ServerActionsTestOtherA-1507860010 tempest-ServerActionsTestOtherA-1507860010-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '48a87efb847142f0ad02ed71d0f99c5b', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'dbc2d99be8d24010ad35cac91bd08ff2', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68906) authorize /opt/stack/nova/nova/policy.py:203}} [ 1905.316291] env[68906]: DEBUG nova.compute.manager [None req-f44832be-2f4a-4582-8745-029b5c53cab6 tempest-ServerActionsTestOtherA-1507860010 tempest-ServerActionsTestOtherA-1507860010-project-member] [instance: 01b79dfa-cd20-495d-b112-8429c28b741e] Start spawning the instance on the hypervisor. 
{{(pid=68906) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1905.340297] env[68906]: DEBUG nova.virt.hardware [None req-f44832be-2f4a-4582-8745-029b5c53cab6 tempest-ServerActionsTestOtherA-1507860010 tempest-ServerActionsTestOtherA-1507860010-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-17T13:00:39Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-17T13:00:23Z,direct_url=,disk_format='vmdk',id=b1400c31-d33b-4e13-944f-4c645e62493e,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='1ae7bf3a375d41c6af5e7536af51ffd1',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-17T13:00:24Z,virtual_size=,visibility=), allow threads: False {{(pid=68906) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1905.340549] env[68906]: DEBUG nova.virt.hardware [None req-f44832be-2f4a-4582-8745-029b5c53cab6 tempest-ServerActionsTestOtherA-1507860010 tempest-ServerActionsTestOtherA-1507860010-project-member] Flavor limits 0:0:0 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1905.340705] env[68906]: DEBUG nova.virt.hardware [None req-f44832be-2f4a-4582-8745-029b5c53cab6 tempest-ServerActionsTestOtherA-1507860010 tempest-ServerActionsTestOtherA-1507860010-project-member] Image limits 0:0:0 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1905.340905] env[68906]: DEBUG nova.virt.hardware [None req-f44832be-2f4a-4582-8745-029b5c53cab6 tempest-ServerActionsTestOtherA-1507860010 tempest-ServerActionsTestOtherA-1507860010-project-member] Flavor pref 0:0:0 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1905.341077] env[68906]: DEBUG nova.virt.hardware [None req-f44832be-2f4a-4582-8745-029b5c53cab6 tempest-ServerActionsTestOtherA-1507860010 tempest-ServerActionsTestOtherA-1507860010-project-member] Image pref 0:0:0 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1905.341228] env[68906]: DEBUG nova.virt.hardware [None req-f44832be-2f4a-4582-8745-029b5c53cab6 tempest-ServerActionsTestOtherA-1507860010 tempest-ServerActionsTestOtherA-1507860010-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1905.341431] env[68906]: DEBUG nova.virt.hardware [None req-f44832be-2f4a-4582-8745-029b5c53cab6 tempest-ServerActionsTestOtherA-1507860010 tempest-ServerActionsTestOtherA-1507860010-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68906) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1905.341588] env[68906]: DEBUG nova.virt.hardware [None req-f44832be-2f4a-4582-8745-029b5c53cab6 tempest-ServerActionsTestOtherA-1507860010 tempest-ServerActionsTestOtherA-1507860010-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68906) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1905.341752] env[68906]: DEBUG 
nova.virt.hardware [None req-f44832be-2f4a-4582-8745-029b5c53cab6 tempest-ServerActionsTestOtherA-1507860010 tempest-ServerActionsTestOtherA-1507860010-project-member] Got 1 possible topologies {{(pid=68906) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1905.341911] env[68906]: DEBUG nova.virt.hardware [None req-f44832be-2f4a-4582-8745-029b5c53cab6 tempest-ServerActionsTestOtherA-1507860010 tempest-ServerActionsTestOtherA-1507860010-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68906) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1905.342096] env[68906]: DEBUG nova.virt.hardware [None req-f44832be-2f4a-4582-8745-029b5c53cab6 tempest-ServerActionsTestOtherA-1507860010 tempest-ServerActionsTestOtherA-1507860010-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68906) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1905.342935] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0d8a910b-fd4f-4b7a-9dc9-a1e109494a32 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1905.353240] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e81a685b-1728-4a7a-acce-cba478081d36 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1905.638393] env[68906]: DEBUG nova.network.neutron [None req-f44832be-2f4a-4582-8745-029b5c53cab6 tempest-ServerActionsTestOtherA-1507860010 tempest-ServerActionsTestOtherA-1507860010-project-member] [instance: 01b79dfa-cd20-495d-b112-8429c28b741e] Successfully created port: d91083d7-c378-4e16-a3fc-83159ba3b412 {{(pid=68906) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1906.182131] env[68906]: DEBUG nova.network.neutron [None req-f44832be-2f4a-4582-8745-029b5c53cab6 tempest-ServerActionsTestOtherA-1507860010 tempest-ServerActionsTestOtherA-1507860010-project-member] [instance: 01b79dfa-cd20-495d-b112-8429c28b741e] Successfully updated port: d91083d7-c378-4e16-a3fc-83159ba3b412 {{(pid=68906) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1906.192422] env[68906]: DEBUG oslo_concurrency.lockutils [None req-f44832be-2f4a-4582-8745-029b5c53cab6 tempest-ServerActionsTestOtherA-1507860010 tempest-ServerActionsTestOtherA-1507860010-project-member] Acquiring lock "refresh_cache-01b79dfa-cd20-495d-b112-8429c28b741e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1906.193348] env[68906]: DEBUG oslo_concurrency.lockutils [None req-f44832be-2f4a-4582-8745-029b5c53cab6 tempest-ServerActionsTestOtherA-1507860010 tempest-ServerActionsTestOtherA-1507860010-project-member] Acquired lock "refresh_cache-01b79dfa-cd20-495d-b112-8429c28b741e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1906.193348] env[68906]: DEBUG nova.network.neutron [None req-f44832be-2f4a-4582-8745-029b5c53cab6 tempest-ServerActionsTestOtherA-1507860010 tempest-ServerActionsTestOtherA-1507860010-project-member] [instance: 01b79dfa-cd20-495d-b112-8429c28b741e] Building network info cache for instance {{(pid=68906) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1906.233912] env[68906]: DEBUG nova.network.neutron [None 
req-f44832be-2f4a-4582-8745-029b5c53cab6 tempest-ServerActionsTestOtherA-1507860010 tempest-ServerActionsTestOtherA-1507860010-project-member] [instance: 01b79dfa-cd20-495d-b112-8429c28b741e] Instance cache missing network info. {{(pid=68906) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1906.387086] env[68906]: DEBUG nova.network.neutron [None req-f44832be-2f4a-4582-8745-029b5c53cab6 tempest-ServerActionsTestOtherA-1507860010 tempest-ServerActionsTestOtherA-1507860010-project-member] [instance: 01b79dfa-cd20-495d-b112-8429c28b741e] Updating instance_info_cache with network_info: [{"id": "d91083d7-c378-4e16-a3fc-83159ba3b412", "address": "fa:16:3e:a9:34:48", "network": {"id": "a338fedc-7b16-4cf6-a1c8-41fa375260b5", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-848536591-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "dbc2d99be8d24010ad35cac91bd08ff2", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "98011432-48cc-4ffd-a5a8-b96d2ea4424a", "external-id": "nsx-vlan-transportzone-745", "segmentation_id": 745, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapd91083d7-c3", "ovs_interfaceid": "d91083d7-c378-4e16-a3fc-83159ba3b412", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68906) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1906.399659] env[68906]: DEBUG oslo_concurrency.lockutils [None req-f44832be-2f4a-4582-8745-029b5c53cab6 tempest-ServerActionsTestOtherA-1507860010 tempest-ServerActionsTestOtherA-1507860010-project-member] Releasing lock "refresh_cache-01b79dfa-cd20-495d-b112-8429c28b741e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1906.399929] env[68906]: DEBUG nova.compute.manager [None req-f44832be-2f4a-4582-8745-029b5c53cab6 tempest-ServerActionsTestOtherA-1507860010 tempest-ServerActionsTestOtherA-1507860010-project-member] [instance: 01b79dfa-cd20-495d-b112-8429c28b741e] Instance network_info: |[{"id": "d91083d7-c378-4e16-a3fc-83159ba3b412", "address": "fa:16:3e:a9:34:48", "network": {"id": "a338fedc-7b16-4cf6-a1c8-41fa375260b5", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-848536591-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "dbc2d99be8d24010ad35cac91bd08ff2", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "98011432-48cc-4ffd-a5a8-b96d2ea4424a", "external-id": "nsx-vlan-transportzone-745", "segmentation_id": 745, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapd91083d7-c3", "ovs_interfaceid": 
"d91083d7-c378-4e16-a3fc-83159ba3b412", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=68906) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1906.400318] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-f44832be-2f4a-4582-8745-029b5c53cab6 tempest-ServerActionsTestOtherA-1507860010 tempest-ServerActionsTestOtherA-1507860010-project-member] [instance: 01b79dfa-cd20-495d-b112-8429c28b741e] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:a9:34:48', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '98011432-48cc-4ffd-a5a8-b96d2ea4424a', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'd91083d7-c378-4e16-a3fc-83159ba3b412', 'vif_model': 'vmxnet3'}] {{(pid=68906) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1906.407678] env[68906]: DEBUG nova.virt.vmwareapi.vm_util [None req-f44832be-2f4a-4582-8745-029b5c53cab6 tempest-ServerActionsTestOtherA-1507860010 tempest-ServerActionsTestOtherA-1507860010-project-member] Creating folder: Project (dbc2d99be8d24010ad35cac91bd08ff2). Parent ref: group-v694750. {{(pid=68906) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1906.408172] env[68906]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-9bf58033-67f6-4e92-8b44-5ef571ba8ea4 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1906.419682] env[68906]: INFO nova.virt.vmwareapi.vm_util [None req-f44832be-2f4a-4582-8745-029b5c53cab6 tempest-ServerActionsTestOtherA-1507860010 tempest-ServerActionsTestOtherA-1507860010-project-member] Created folder: Project (dbc2d99be8d24010ad35cac91bd08ff2) in parent group-v694750. [ 1906.419862] env[68906]: DEBUG nova.virt.vmwareapi.vm_util [None req-f44832be-2f4a-4582-8745-029b5c53cab6 tempest-ServerActionsTestOtherA-1507860010 tempest-ServerActionsTestOtherA-1507860010-project-member] Creating folder: Instances. Parent ref: group-v694851. {{(pid=68906) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1906.420100] env[68906]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-f3fa41de-1660-4094-9700-6839d8735509 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1906.429184] env[68906]: INFO nova.virt.vmwareapi.vm_util [None req-f44832be-2f4a-4582-8745-029b5c53cab6 tempest-ServerActionsTestOtherA-1507860010 tempest-ServerActionsTestOtherA-1507860010-project-member] Created folder: Instances in parent group-v694851. [ 1906.429438] env[68906]: DEBUG oslo.service.loopingcall [None req-f44832be-2f4a-4582-8745-029b5c53cab6 tempest-ServerActionsTestOtherA-1507860010 tempest-ServerActionsTestOtherA-1507860010-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=68906) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1906.429680] env[68906]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 01b79dfa-cd20-495d-b112-8429c28b741e] Creating VM on the ESX host {{(pid=68906) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1906.429916] env[68906]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-33494220-29df-4bf5-8470-9ce449921b13 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1906.451928] env[68906]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1906.451928] env[68906]: value = "task-3475447" [ 1906.451928] env[68906]: _type = "Task" [ 1906.451928] env[68906]: } to complete. {{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1906.460601] env[68906]: DEBUG oslo_vmware.api [-] Task: {'id': task-3475447, 'name': CreateVM_Task} progress is 0%. {{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1906.580262] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1906.838244] env[68906]: DEBUG nova.compute.manager [req-91832d0e-ea32-422d-a2d0-070e04d59105 req-fb7831c5-80f9-494c-9532-9f53450e5294 service nova] [instance: 01b79dfa-cd20-495d-b112-8429c28b741e] Received event network-vif-plugged-d91083d7-c378-4e16-a3fc-83159ba3b412 {{(pid=68906) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1906.840220] env[68906]: DEBUG oslo_concurrency.lockutils [req-91832d0e-ea32-422d-a2d0-070e04d59105 req-fb7831c5-80f9-494c-9532-9f53450e5294 service nova] Acquiring lock "01b79dfa-cd20-495d-b112-8429c28b741e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1906.840456] env[68906]: DEBUG oslo_concurrency.lockutils [req-91832d0e-ea32-422d-a2d0-070e04d59105 req-fb7831c5-80f9-494c-9532-9f53450e5294 service nova] Lock "01b79dfa-cd20-495d-b112-8429c28b741e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1906.840628] env[68906]: DEBUG oslo_concurrency.lockutils [req-91832d0e-ea32-422d-a2d0-070e04d59105 req-fb7831c5-80f9-494c-9532-9f53450e5294 service nova] Lock "01b79dfa-cd20-495d-b112-8429c28b741e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1906.840796] env[68906]: DEBUG nova.compute.manager [req-91832d0e-ea32-422d-a2d0-070e04d59105 req-fb7831c5-80f9-494c-9532-9f53450e5294 service nova] [instance: 01b79dfa-cd20-495d-b112-8429c28b741e] No waiting events found dispatching network-vif-plugged-d91083d7-c378-4e16-a3fc-83159ba3b412 {{(pid=68906) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1906.840961] env[68906]: WARNING nova.compute.manager [req-91832d0e-ea32-422d-a2d0-070e04d59105 req-fb7831c5-80f9-494c-9532-9f53450e5294 service nova] [instance: 
01b79dfa-cd20-495d-b112-8429c28b741e] Received unexpected event network-vif-plugged-d91083d7-c378-4e16-a3fc-83159ba3b412 for instance with vm_state building and task_state spawning. [ 1906.841168] env[68906]: DEBUG nova.compute.manager [req-91832d0e-ea32-422d-a2d0-070e04d59105 req-fb7831c5-80f9-494c-9532-9f53450e5294 service nova] [instance: 01b79dfa-cd20-495d-b112-8429c28b741e] Received event network-changed-d91083d7-c378-4e16-a3fc-83159ba3b412 {{(pid=68906) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1906.841333] env[68906]: DEBUG nova.compute.manager [req-91832d0e-ea32-422d-a2d0-070e04d59105 req-fb7831c5-80f9-494c-9532-9f53450e5294 service nova] [instance: 01b79dfa-cd20-495d-b112-8429c28b741e] Refreshing instance network info cache due to event network-changed-d91083d7-c378-4e16-a3fc-83159ba3b412. {{(pid=68906) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1906.841517] env[68906]: DEBUG oslo_concurrency.lockutils [req-91832d0e-ea32-422d-a2d0-070e04d59105 req-fb7831c5-80f9-494c-9532-9f53450e5294 service nova] Acquiring lock "refresh_cache-01b79dfa-cd20-495d-b112-8429c28b741e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1906.841657] env[68906]: DEBUG oslo_concurrency.lockutils [req-91832d0e-ea32-422d-a2d0-070e04d59105 req-fb7831c5-80f9-494c-9532-9f53450e5294 service nova] Acquired lock "refresh_cache-01b79dfa-cd20-495d-b112-8429c28b741e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1906.841815] env[68906]: DEBUG nova.network.neutron [req-91832d0e-ea32-422d-a2d0-070e04d59105 req-fb7831c5-80f9-494c-9532-9f53450e5294 service nova] [instance: 01b79dfa-cd20-495d-b112-8429c28b741e] Refreshing network info cache for port d91083d7-c378-4e16-a3fc-83159ba3b412 {{(pid=68906) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1906.961225] env[68906]: DEBUG oslo_vmware.api [-] Task: {'id': task-3475447, 'name': CreateVM_Task, 'duration_secs': 0.279144} completed successfully. 
{{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1906.961459] env[68906]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 01b79dfa-cd20-495d-b112-8429c28b741e] Created VM on the ESX host {{(pid=68906) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1906.968027] env[68906]: DEBUG oslo_concurrency.lockutils [None req-f44832be-2f4a-4582-8745-029b5c53cab6 tempest-ServerActionsTestOtherA-1507860010 tempest-ServerActionsTestOtherA-1507860010-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1906.968203] env[68906]: DEBUG oslo_concurrency.lockutils [None req-f44832be-2f4a-4582-8745-029b5c53cab6 tempest-ServerActionsTestOtherA-1507860010 tempest-ServerActionsTestOtherA-1507860010-project-member] Acquired lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1906.968527] env[68906]: DEBUG oslo_concurrency.lockutils [None req-f44832be-2f4a-4582-8745-029b5c53cab6 tempest-ServerActionsTestOtherA-1507860010 tempest-ServerActionsTestOtherA-1507860010-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1906.968767] env[68906]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-cbd2909a-b91d-4e31-aa62-eeb4fe003411 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1906.975598] env[68906]: DEBUG oslo_vmware.api [None req-f44832be-2f4a-4582-8745-029b5c53cab6 tempest-ServerActionsTestOtherA-1507860010 tempest-ServerActionsTestOtherA-1507860010-project-member] Waiting for the task: (returnval){ [ 1906.975598] env[68906]: value = "session[52a3cb0d-1212-490c-2669-91043b4da4d8]52aa8857-0735-4858-54ca-935367821bfb" [ 1906.975598] env[68906]: _type = "Task" [ 1906.975598] env[68906]: } to complete. {{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1906.983226] env[68906]: DEBUG oslo_vmware.api [None req-f44832be-2f4a-4582-8745-029b5c53cab6 tempest-ServerActionsTestOtherA-1507860010 tempest-ServerActionsTestOtherA-1507860010-project-member] Task: {'id': session[52a3cb0d-1212-490c-2669-91043b4da4d8]52aa8857-0735-4858-54ca-935367821bfb, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1907.088831] env[68906]: DEBUG nova.network.neutron [req-91832d0e-ea32-422d-a2d0-070e04d59105 req-fb7831c5-80f9-494c-9532-9f53450e5294 service nova] [instance: 01b79dfa-cd20-495d-b112-8429c28b741e] Updated VIF entry in instance network info cache for port d91083d7-c378-4e16-a3fc-83159ba3b412. 
{{(pid=68906) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1907.089197] env[68906]: DEBUG nova.network.neutron [req-91832d0e-ea32-422d-a2d0-070e04d59105 req-fb7831c5-80f9-494c-9532-9f53450e5294 service nova] [instance: 01b79dfa-cd20-495d-b112-8429c28b741e] Updating instance_info_cache with network_info: [{"id": "d91083d7-c378-4e16-a3fc-83159ba3b412", "address": "fa:16:3e:a9:34:48", "network": {"id": "a338fedc-7b16-4cf6-a1c8-41fa375260b5", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-848536591-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "dbc2d99be8d24010ad35cac91bd08ff2", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "98011432-48cc-4ffd-a5a8-b96d2ea4424a", "external-id": "nsx-vlan-transportzone-745", "segmentation_id": 745, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapd91083d7-c3", "ovs_interfaceid": "d91083d7-c378-4e16-a3fc-83159ba3b412", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68906) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1907.099546] env[68906]: DEBUG oslo_concurrency.lockutils [req-91832d0e-ea32-422d-a2d0-070e04d59105 req-fb7831c5-80f9-494c-9532-9f53450e5294 service nova] Releasing lock "refresh_cache-01b79dfa-cd20-495d-b112-8429c28b741e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1907.485693] env[68906]: DEBUG oslo_concurrency.lockutils [None req-f44832be-2f4a-4582-8745-029b5c53cab6 tempest-ServerActionsTestOtherA-1507860010 tempest-ServerActionsTestOtherA-1507860010-project-member] Releasing lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1907.486109] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-f44832be-2f4a-4582-8745-029b5c53cab6 tempest-ServerActionsTestOtherA-1507860010 tempest-ServerActionsTestOtherA-1507860010-project-member] [instance: 01b79dfa-cd20-495d-b112-8429c28b741e] Processing image b1400c31-d33b-4e13-944f-4c645e62493e {{(pid=68906) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1907.486163] env[68906]: DEBUG oslo_concurrency.lockutils [None req-f44832be-2f4a-4582-8745-029b5c53cab6 tempest-ServerActionsTestOtherA-1507860010 tempest-ServerActionsTestOtherA-1507860010-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e/b1400c31-d33b-4e13-944f-4c645e62493e.vmdk" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1910.141165] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1910.141534] env[68906]: DEBUG nova.compute.manager 
[None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Starting heal instance info cache {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 1910.141534] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Rebuilding the list of instances to heal {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 1910.164437] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: aed06616-d008-4695-b66e-9f40acf5ebd3] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1910.164658] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: 17327bc3-433e-4006-93c7-e53714ed70c2] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1910.164746] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: 32f5b54d-30bf-4fe9-9622-3ff74344b3f3] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1910.164858] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: 922d81ba-c8d2-43ba-b1c5-f2943418d6a2] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1910.164979] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: 736db39c-e5e5-4a54-b85a-aa5c703f432e] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1910.165139] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: ce6e5cd6-efb8-46d1-811d-74c084661cce] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1910.165414] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: 7994d291-b4bf-48f5-ad34-c1f484d77f6e] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1910.165414] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: 860248ea-e77b-4ff6-af64-b75f88a31348] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1910.165561] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: 3cfde5a7-3148-426c-8867-ffafb33dc95b] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1910.165612] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: 01b79dfa-cd20-495d-b112-8429c28b741e] Skipping network cache update for instance because it is Building. 
{{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1910.165715] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Didn't find any instances for network info cache update. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 1910.166265] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1912.141637] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1913.136882] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1914.140594] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1917.140690] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1917.140971] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1917.141103] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=68906) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 1921.137055] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1922.141042] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager.update_available_resource {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1922.153557] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1922.153794] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1922.153982] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1922.154213] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=68906) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1922.155348] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e81d5812-a147-4c44-9f18-2e11ab3bbb64 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1922.164021] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5cb6b0fa-a056-42c1-ad60-ef77a89f9415 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1922.177674] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-34442445-b1be-4eaa-8c25-5c90f77f457d {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1922.183966] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b4ff8ea7-2cb2-4d17-bb71-2dff317b7bcc {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1922.213488] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180947MB free_disk=93GB free_vcpus=48 pci_devices=None {{(pid=68906) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1922.213604] env[68906]: DEBUG 
oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1922.213795] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1922.284404] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance aed06616-d008-4695-b66e-9f40acf5ebd3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1922.284589] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 17327bc3-433e-4006-93c7-e53714ed70c2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1922.284725] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 32f5b54d-30bf-4fe9-9622-3ff74344b3f3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1922.284849] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 922d81ba-c8d2-43ba-b1c5-f2943418d6a2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1922.284967] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 736db39c-e5e5-4a54-b85a-aa5c703f432e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1922.285097] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance ce6e5cd6-efb8-46d1-811d-74c084661cce actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1922.285214] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 7994d291-b4bf-48f5-ad34-c1f484d77f6e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1922.285327] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 860248ea-e77b-4ff6-af64-b75f88a31348 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1922.285439] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 3cfde5a7-3148-426c-8867-ffafb33dc95b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1922.285551] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 01b79dfa-cd20-495d-b112-8429c28b741e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1922.295583] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 8bfc91d4-b1d7-449a-8d48-0e63490fe663 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1922.305345] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance d70b039d-c8ad-4ffd-84f8-08f17cb97578 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1922.313971] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance ed276c3c-6085-427d-b3b7-86bbb8660dbc has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1922.314201] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=68906) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1922.314348] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=68906) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1922.455073] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4d0dff9d-d746-4b88-8b34-662f15ef8b82 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1922.462745] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a8273b91-3299-4b15-80a4-270b3f2e462c {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1922.492542] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-337caf76-6818-49ed-905a-c3c77a14ff74 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1922.499123] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-deb5f78b-8c3b-401f-be6b-50a5a1821bbf {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1922.511706] env[68906]: DEBUG nova.compute.provider_tree [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Inventory has not changed in ProviderTree for provider: 1119f6db-bfd7-4ef3-bdff-5c6974dc249b {{(pid=68906) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1922.520015] env[68906]: DEBUG nova.scheduler.client.report [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Inventory has not changed for provider 1119f6db-bfd7-4ef3-bdff-5c6974dc249b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 93, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68906) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1922.533585] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=68906) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1922.533769] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.320s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1950.938525] env[68906]: WARNING oslo_vmware.rw_handles [None 
req-1ff0dcfe-1419-4b8a-a4ce-0afaadd97797 tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1950.938525] env[68906]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1950.938525] env[68906]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1950.938525] env[68906]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1950.938525] env[68906]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1950.938525] env[68906]: ERROR oslo_vmware.rw_handles response.begin() [ 1950.938525] env[68906]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1950.938525] env[68906]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1950.938525] env[68906]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1950.938525] env[68906]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1950.938525] env[68906]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1950.938525] env[68906]: ERROR oslo_vmware.rw_handles [ 1950.939283] env[68906]: DEBUG nova.virt.vmwareapi.images [None req-1ff0dcfe-1419-4b8a-a4ce-0afaadd97797 tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] [instance: aed06616-d008-4695-b66e-9f40acf5ebd3] Downloaded image file data b1400c31-d33b-4e13-944f-4c645e62493e to vmware_temp/8ea48b0f-1bc1-4cfe-8dee-96dc72209bdf/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk on the data store datastore2 {{(pid=68906) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1950.941276] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-1ff0dcfe-1419-4b8a-a4ce-0afaadd97797 tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] [instance: aed06616-d008-4695-b66e-9f40acf5ebd3] Caching image {{(pid=68906) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1950.941536] env[68906]: DEBUG nova.virt.vmwareapi.vm_util [None req-1ff0dcfe-1419-4b8a-a4ce-0afaadd97797 tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] Copying Virtual Disk [datastore2] vmware_temp/8ea48b0f-1bc1-4cfe-8dee-96dc72209bdf/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk to [datastore2] vmware_temp/8ea48b0f-1bc1-4cfe-8dee-96dc72209bdf/b1400c31-d33b-4e13-944f-4c645e62493e/b1400c31-d33b-4e13-944f-4c645e62493e.vmdk {{(pid=68906) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1950.941838] env[68906]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-a7cac436-c652-477c-9a61-c1f08f57523b {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1950.951240] env[68906]: DEBUG oslo_vmware.api [None req-1ff0dcfe-1419-4b8a-a4ce-0afaadd97797 tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] Waiting for the task: (returnval){ [ 1950.951240] env[68906]: value = "task-3475448" [ 1950.951240] env[68906]: _type = "Task" [ 1950.951240] env[68906]: } to complete. 
{{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1950.959464] env[68906]: DEBUG oslo_vmware.api [None req-1ff0dcfe-1419-4b8a-a4ce-0afaadd97797 tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] Task: {'id': task-3475448, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1951.461992] env[68906]: DEBUG oslo_vmware.exceptions [None req-1ff0dcfe-1419-4b8a-a4ce-0afaadd97797 tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] Fault InvalidArgument not matched. {{(pid=68906) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1951.461992] env[68906]: DEBUG oslo_concurrency.lockutils [None req-1ff0dcfe-1419-4b8a-a4ce-0afaadd97797 tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] Releasing lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e/b1400c31-d33b-4e13-944f-4c645e62493e.vmdk" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1951.462593] env[68906]: ERROR nova.compute.manager [None req-1ff0dcfe-1419-4b8a-a4ce-0afaadd97797 tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] [instance: aed06616-d008-4695-b66e-9f40acf5ebd3] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1951.462593] env[68906]: Faults: ['InvalidArgument'] [ 1951.462593] env[68906]: ERROR nova.compute.manager [instance: aed06616-d008-4695-b66e-9f40acf5ebd3] Traceback (most recent call last): [ 1951.462593] env[68906]: ERROR nova.compute.manager [instance: aed06616-d008-4695-b66e-9f40acf5ebd3] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1951.462593] env[68906]: ERROR nova.compute.manager [instance: aed06616-d008-4695-b66e-9f40acf5ebd3] yield resources [ 1951.462593] env[68906]: ERROR nova.compute.manager [instance: aed06616-d008-4695-b66e-9f40acf5ebd3] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1951.462593] env[68906]: ERROR nova.compute.manager [instance: aed06616-d008-4695-b66e-9f40acf5ebd3] self.driver.spawn(context, instance, image_meta, [ 1951.462593] env[68906]: ERROR nova.compute.manager [instance: aed06616-d008-4695-b66e-9f40acf5ebd3] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1951.462593] env[68906]: ERROR nova.compute.manager [instance: aed06616-d008-4695-b66e-9f40acf5ebd3] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1951.462593] env[68906]: ERROR nova.compute.manager [instance: aed06616-d008-4695-b66e-9f40acf5ebd3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1951.462593] env[68906]: ERROR nova.compute.manager [instance: aed06616-d008-4695-b66e-9f40acf5ebd3] self._fetch_image_if_missing(context, vi) [ 1951.462593] env[68906]: ERROR nova.compute.manager [instance: aed06616-d008-4695-b66e-9f40acf5ebd3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1951.462593] env[68906]: ERROR nova.compute.manager [instance: aed06616-d008-4695-b66e-9f40acf5ebd3] image_cache(vi, tmp_image_ds_loc) [ 1951.463103] env[68906]: ERROR nova.compute.manager [instance: aed06616-d008-4695-b66e-9f40acf5ebd3] File 
"/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1951.463103] env[68906]: ERROR nova.compute.manager [instance: aed06616-d008-4695-b66e-9f40acf5ebd3] vm_util.copy_virtual_disk( [ 1951.463103] env[68906]: ERROR nova.compute.manager [instance: aed06616-d008-4695-b66e-9f40acf5ebd3] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1951.463103] env[68906]: ERROR nova.compute.manager [instance: aed06616-d008-4695-b66e-9f40acf5ebd3] session._wait_for_task(vmdk_copy_task) [ 1951.463103] env[68906]: ERROR nova.compute.manager [instance: aed06616-d008-4695-b66e-9f40acf5ebd3] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1951.463103] env[68906]: ERROR nova.compute.manager [instance: aed06616-d008-4695-b66e-9f40acf5ebd3] return self.wait_for_task(task_ref) [ 1951.463103] env[68906]: ERROR nova.compute.manager [instance: aed06616-d008-4695-b66e-9f40acf5ebd3] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1951.463103] env[68906]: ERROR nova.compute.manager [instance: aed06616-d008-4695-b66e-9f40acf5ebd3] return evt.wait() [ 1951.463103] env[68906]: ERROR nova.compute.manager [instance: aed06616-d008-4695-b66e-9f40acf5ebd3] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1951.463103] env[68906]: ERROR nova.compute.manager [instance: aed06616-d008-4695-b66e-9f40acf5ebd3] result = hub.switch() [ 1951.463103] env[68906]: ERROR nova.compute.manager [instance: aed06616-d008-4695-b66e-9f40acf5ebd3] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1951.463103] env[68906]: ERROR nova.compute.manager [instance: aed06616-d008-4695-b66e-9f40acf5ebd3] return self.greenlet.switch() [ 1951.463103] env[68906]: ERROR nova.compute.manager [instance: aed06616-d008-4695-b66e-9f40acf5ebd3] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1951.463505] env[68906]: ERROR nova.compute.manager [instance: aed06616-d008-4695-b66e-9f40acf5ebd3] self.f(*self.args, **self.kw) [ 1951.463505] env[68906]: ERROR nova.compute.manager [instance: aed06616-d008-4695-b66e-9f40acf5ebd3] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1951.463505] env[68906]: ERROR nova.compute.manager [instance: aed06616-d008-4695-b66e-9f40acf5ebd3] raise exceptions.translate_fault(task_info.error) [ 1951.463505] env[68906]: ERROR nova.compute.manager [instance: aed06616-d008-4695-b66e-9f40acf5ebd3] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1951.463505] env[68906]: ERROR nova.compute.manager [instance: aed06616-d008-4695-b66e-9f40acf5ebd3] Faults: ['InvalidArgument'] [ 1951.463505] env[68906]: ERROR nova.compute.manager [instance: aed06616-d008-4695-b66e-9f40acf5ebd3] [ 1951.463505] env[68906]: INFO nova.compute.manager [None req-1ff0dcfe-1419-4b8a-a4ce-0afaadd97797 tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] [instance: aed06616-d008-4695-b66e-9f40acf5ebd3] Terminating instance [ 1951.464623] env[68906]: DEBUG oslo_concurrency.lockutils [None req-b9080d5d-4234-4166-bade-e67041a2b08f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Acquired lock "[datastore2] 
devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e/b1400c31-d33b-4e13-944f-4c645e62493e.vmdk" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1951.464985] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-b9080d5d-4234-4166-bade-e67041a2b08f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=68906) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1951.465091] env[68906]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-105ca8c4-dee9-47c2-940c-51a54b25a1ac {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1951.467394] env[68906]: DEBUG nova.compute.manager [None req-1ff0dcfe-1419-4b8a-a4ce-0afaadd97797 tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] [instance: aed06616-d008-4695-b66e-9f40acf5ebd3] Start destroying the instance on the hypervisor. {{(pid=68906) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1951.467590] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-1ff0dcfe-1419-4b8a-a4ce-0afaadd97797 tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] [instance: aed06616-d008-4695-b66e-9f40acf5ebd3] Destroying instance {{(pid=68906) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1951.468354] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-21ad6656-6e21-42fc-ab18-05dcf109f6f1 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1951.475560] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-1ff0dcfe-1419-4b8a-a4ce-0afaadd97797 tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] [instance: aed06616-d008-4695-b66e-9f40acf5ebd3] Unregistering the VM {{(pid=68906) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1951.475825] env[68906]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-ddb0330f-6296-41f0-934a-5795d05e499c {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1951.478226] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-b9080d5d-4234-4166-bade-e67041a2b08f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=68906) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1951.478426] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-b9080d5d-4234-4166-bade-e67041a2b08f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Folder [datastore2] devstack-image-cache_base created. 
{{(pid=68906) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1951.479429] env[68906]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-9b8e4f59-d885-4e25-b810-98549a26066c {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1951.484566] env[68906]: DEBUG oslo_vmware.api [None req-b9080d5d-4234-4166-bade-e67041a2b08f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Waiting for the task: (returnval){ [ 1951.484566] env[68906]: value = "session[52a3cb0d-1212-490c-2669-91043b4da4d8]52018859-2c44-9b36-748b-c02f660e164e" [ 1951.484566] env[68906]: _type = "Task" [ 1951.484566] env[68906]: } to complete. {{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1951.492523] env[68906]: DEBUG oslo_vmware.api [None req-b9080d5d-4234-4166-bade-e67041a2b08f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Task: {'id': session[52a3cb0d-1212-490c-2669-91043b4da4d8]52018859-2c44-9b36-748b-c02f660e164e, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1951.554703] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-1ff0dcfe-1419-4b8a-a4ce-0afaadd97797 tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] [instance: aed06616-d008-4695-b66e-9f40acf5ebd3] Unregistered the VM {{(pid=68906) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1951.554703] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-1ff0dcfe-1419-4b8a-a4ce-0afaadd97797 tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] [instance: aed06616-d008-4695-b66e-9f40acf5ebd3] Deleting contents of the VM from datastore datastore2 {{(pid=68906) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1951.554703] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-1ff0dcfe-1419-4b8a-a4ce-0afaadd97797 tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] Deleting the datastore file [datastore2] aed06616-d008-4695-b66e-9f40acf5ebd3 {{(pid=68906) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1951.554703] env[68906]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-f1e79759-ac2e-45d2-9bc1-aa66275858ba {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1951.560734] env[68906]: DEBUG oslo_vmware.api [None req-1ff0dcfe-1419-4b8a-a4ce-0afaadd97797 tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] Waiting for the task: (returnval){ [ 1951.560734] env[68906]: value = "task-3475450" [ 1951.560734] env[68906]: _type = "Task" [ 1951.560734] env[68906]: } to complete. {{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1951.568381] env[68906]: DEBUG oslo_vmware.api [None req-1ff0dcfe-1419-4b8a-a4ce-0afaadd97797 tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] Task: {'id': task-3475450, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1951.995057] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-b9080d5d-4234-4166-bade-e67041a2b08f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] [instance: 17327bc3-433e-4006-93c7-e53714ed70c2] Preparing fetch location {{(pid=68906) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1951.995377] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-b9080d5d-4234-4166-bade-e67041a2b08f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Creating directory with path [datastore2] vmware_temp/faee50ba-fae6-40d2-ac55-b02fd17cdf54/b1400c31-d33b-4e13-944f-4c645e62493e {{(pid=68906) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1951.995561] env[68906]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-33af9377-ad3e-4e4a-ac97-e3b3eea4ed5d {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1952.006531] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-b9080d5d-4234-4166-bade-e67041a2b08f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Created directory with path [datastore2] vmware_temp/faee50ba-fae6-40d2-ac55-b02fd17cdf54/b1400c31-d33b-4e13-944f-4c645e62493e {{(pid=68906) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1952.006729] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-b9080d5d-4234-4166-bade-e67041a2b08f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] [instance: 17327bc3-433e-4006-93c7-e53714ed70c2] Fetch image to [datastore2] vmware_temp/faee50ba-fae6-40d2-ac55-b02fd17cdf54/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk {{(pid=68906) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1952.006897] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-b9080d5d-4234-4166-bade-e67041a2b08f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] [instance: 17327bc3-433e-4006-93c7-e53714ed70c2] Downloading image file data b1400c31-d33b-4e13-944f-4c645e62493e to [datastore2] vmware_temp/faee50ba-fae6-40d2-ac55-b02fd17cdf54/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk on the data store datastore2 {{(pid=68906) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1952.007657] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7aa50314-69d2-4f23-93ae-cdf4ccaa9f64 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1952.013751] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cf8a8e63-79ab-4599-93f0-c19debbef921 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1952.022632] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-db23c810-423b-427b-ba0f-64f0cbe3734b {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1952.053565] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-ab47c648-59e2-479b-87d8-2983029fb8db {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1952.058968] env[68906]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-b60cab0a-d4be-4b3c-ad94-e2bf1963e915 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1952.068513] env[68906]: DEBUG oslo_vmware.api [None req-1ff0dcfe-1419-4b8a-a4ce-0afaadd97797 tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] Task: {'id': task-3475450, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.063035} completed successfully. {{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1952.068739] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-1ff0dcfe-1419-4b8a-a4ce-0afaadd97797 tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] Deleted the datastore file {{(pid=68906) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1952.068971] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-1ff0dcfe-1419-4b8a-a4ce-0afaadd97797 tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] [instance: aed06616-d008-4695-b66e-9f40acf5ebd3] Deleted contents of the VM from datastore datastore2 {{(pid=68906) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1952.069170] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-1ff0dcfe-1419-4b8a-a4ce-0afaadd97797 tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] [instance: aed06616-d008-4695-b66e-9f40acf5ebd3] Instance destroyed {{(pid=68906) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1952.069346] env[68906]: INFO nova.compute.manager [None req-1ff0dcfe-1419-4b8a-a4ce-0afaadd97797 tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] [instance: aed06616-d008-4695-b66e-9f40acf5ebd3] Took 0.60 seconds to destroy the instance on the hypervisor. 
[ 1952.071344] env[68906]: DEBUG nova.compute.claims [None req-1ff0dcfe-1419-4b8a-a4ce-0afaadd97797 tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] [instance: aed06616-d008-4695-b66e-9f40acf5ebd3] Aborting claim: {{(pid=68906) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1952.071517] env[68906]: DEBUG oslo_concurrency.lockutils [None req-1ff0dcfe-1419-4b8a-a4ce-0afaadd97797 tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1952.071736] env[68906]: DEBUG oslo_concurrency.lockutils [None req-1ff0dcfe-1419-4b8a-a4ce-0afaadd97797 tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1952.077621] env[68906]: DEBUG nova.virt.vmwareapi.images [None req-b9080d5d-4234-4166-bade-e67041a2b08f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] [instance: 17327bc3-433e-4006-93c7-e53714ed70c2] Downloading image file data b1400c31-d33b-4e13-944f-4c645e62493e to the data store datastore2 {{(pid=68906) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1952.203726] env[68906]: DEBUG oslo_vmware.rw_handles [None req-b9080d5d-4234-4166-bade-e67041a2b08f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/faee50ba-fae6-40d2-ac55-b02fd17cdf54/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=68906) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1952.263898] env[68906]: DEBUG oslo_vmware.rw_handles [None req-b9080d5d-4234-4166-bade-e67041a2b08f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Completed reading data from the image iterator. {{(pid=68906) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1952.264181] env[68906]: DEBUG oslo_vmware.rw_handles [None req-b9080d5d-4234-4166-bade-e67041a2b08f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Closing write handle for https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/faee50ba-fae6-40d2-ac55-b02fd17cdf54/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=68906) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1952.324123] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-28c20dfd-d594-4094-ae8c-93d4d7c12bc8 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1952.331371] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3626b95c-ac57-49f4-979a-fd3703540b81 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1952.360127] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2dbe3fb3-3f49-484e-9732-591ab042e5d1 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1952.366828] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-eea87d17-302f-4c58-8d9c-2622884ec36e {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1952.380904] env[68906]: DEBUG nova.compute.provider_tree [None req-1ff0dcfe-1419-4b8a-a4ce-0afaadd97797 tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] Inventory has not changed in ProviderTree for provider: 1119f6db-bfd7-4ef3-bdff-5c6974dc249b {{(pid=68906) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1952.388968] env[68906]: DEBUG nova.scheduler.client.report [None req-1ff0dcfe-1419-4b8a-a4ce-0afaadd97797 tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] Inventory has not changed for provider 1119f6db-bfd7-4ef3-bdff-5c6974dc249b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 93, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68906) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1952.404443] env[68906]: DEBUG oslo_concurrency.lockutils [None req-1ff0dcfe-1419-4b8a-a4ce-0afaadd97797 tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.333s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1952.405062] env[68906]: ERROR nova.compute.manager [None req-1ff0dcfe-1419-4b8a-a4ce-0afaadd97797 tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] [instance: aed06616-d008-4695-b66e-9f40acf5ebd3] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1952.405062] env[68906]: Faults: ['InvalidArgument'] [ 1952.405062] env[68906]: ERROR nova.compute.manager [instance: aed06616-d008-4695-b66e-9f40acf5ebd3] Traceback (most recent call last): [ 1952.405062] env[68906]: ERROR nova.compute.manager [instance: aed06616-d008-4695-b66e-9f40acf5ebd3] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1952.405062] env[68906]: ERROR nova.compute.manager [instance: 
aed06616-d008-4695-b66e-9f40acf5ebd3] self.driver.spawn(context, instance, image_meta, [ 1952.405062] env[68906]: ERROR nova.compute.manager [instance: aed06616-d008-4695-b66e-9f40acf5ebd3] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1952.405062] env[68906]: ERROR nova.compute.manager [instance: aed06616-d008-4695-b66e-9f40acf5ebd3] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1952.405062] env[68906]: ERROR nova.compute.manager [instance: aed06616-d008-4695-b66e-9f40acf5ebd3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1952.405062] env[68906]: ERROR nova.compute.manager [instance: aed06616-d008-4695-b66e-9f40acf5ebd3] self._fetch_image_if_missing(context, vi) [ 1952.405062] env[68906]: ERROR nova.compute.manager [instance: aed06616-d008-4695-b66e-9f40acf5ebd3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1952.405062] env[68906]: ERROR nova.compute.manager [instance: aed06616-d008-4695-b66e-9f40acf5ebd3] image_cache(vi, tmp_image_ds_loc) [ 1952.405062] env[68906]: ERROR nova.compute.manager [instance: aed06616-d008-4695-b66e-9f40acf5ebd3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1952.405454] env[68906]: ERROR nova.compute.manager [instance: aed06616-d008-4695-b66e-9f40acf5ebd3] vm_util.copy_virtual_disk( [ 1952.405454] env[68906]: ERROR nova.compute.manager [instance: aed06616-d008-4695-b66e-9f40acf5ebd3] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1952.405454] env[68906]: ERROR nova.compute.manager [instance: aed06616-d008-4695-b66e-9f40acf5ebd3] session._wait_for_task(vmdk_copy_task) [ 1952.405454] env[68906]: ERROR nova.compute.manager [instance: aed06616-d008-4695-b66e-9f40acf5ebd3] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1952.405454] env[68906]: ERROR nova.compute.manager [instance: aed06616-d008-4695-b66e-9f40acf5ebd3] return self.wait_for_task(task_ref) [ 1952.405454] env[68906]: ERROR nova.compute.manager [instance: aed06616-d008-4695-b66e-9f40acf5ebd3] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1952.405454] env[68906]: ERROR nova.compute.manager [instance: aed06616-d008-4695-b66e-9f40acf5ebd3] return evt.wait() [ 1952.405454] env[68906]: ERROR nova.compute.manager [instance: aed06616-d008-4695-b66e-9f40acf5ebd3] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1952.405454] env[68906]: ERROR nova.compute.manager [instance: aed06616-d008-4695-b66e-9f40acf5ebd3] result = hub.switch() [ 1952.405454] env[68906]: ERROR nova.compute.manager [instance: aed06616-d008-4695-b66e-9f40acf5ebd3] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1952.405454] env[68906]: ERROR nova.compute.manager [instance: aed06616-d008-4695-b66e-9f40acf5ebd3] return self.greenlet.switch() [ 1952.405454] env[68906]: ERROR nova.compute.manager [instance: aed06616-d008-4695-b66e-9f40acf5ebd3] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1952.405454] env[68906]: ERROR nova.compute.manager [instance: aed06616-d008-4695-b66e-9f40acf5ebd3] self.f(*self.args, **self.kw) [ 1952.405813] env[68906]: ERROR nova.compute.manager [instance: aed06616-d008-4695-b66e-9f40acf5ebd3] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1952.405813] env[68906]: ERROR nova.compute.manager [instance: aed06616-d008-4695-b66e-9f40acf5ebd3] raise exceptions.translate_fault(task_info.error) [ 1952.405813] env[68906]: ERROR nova.compute.manager [instance: aed06616-d008-4695-b66e-9f40acf5ebd3] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1952.405813] env[68906]: ERROR nova.compute.manager [instance: aed06616-d008-4695-b66e-9f40acf5ebd3] Faults: ['InvalidArgument'] [ 1952.405813] env[68906]: ERROR nova.compute.manager [instance: aed06616-d008-4695-b66e-9f40acf5ebd3] [ 1952.405813] env[68906]: DEBUG nova.compute.utils [None req-1ff0dcfe-1419-4b8a-a4ce-0afaadd97797 tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] [instance: aed06616-d008-4695-b66e-9f40acf5ebd3] VimFaultException {{(pid=68906) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1952.407141] env[68906]: DEBUG nova.compute.manager [None req-1ff0dcfe-1419-4b8a-a4ce-0afaadd97797 tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] [instance: aed06616-d008-4695-b66e-9f40acf5ebd3] Build of instance aed06616-d008-4695-b66e-9f40acf5ebd3 was re-scheduled: A specified parameter was not correct: fileType [ 1952.407141] env[68906]: Faults: ['InvalidArgument'] {{(pid=68906) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1952.407514] env[68906]: DEBUG nova.compute.manager [None req-1ff0dcfe-1419-4b8a-a4ce-0afaadd97797 tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] [instance: aed06616-d008-4695-b66e-9f40acf5ebd3] Unplugging VIFs for instance {{(pid=68906) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1952.407693] env[68906]: DEBUG nova.compute.manager [None req-1ff0dcfe-1419-4b8a-a4ce-0afaadd97797 tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=68906) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1952.407863] env[68906]: DEBUG nova.compute.manager [None req-1ff0dcfe-1419-4b8a-a4ce-0afaadd97797 tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] [instance: aed06616-d008-4695-b66e-9f40acf5ebd3] Deallocating network for instance {{(pid=68906) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1952.408043] env[68906]: DEBUG nova.network.neutron [None req-1ff0dcfe-1419-4b8a-a4ce-0afaadd97797 tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] [instance: aed06616-d008-4695-b66e-9f40acf5ebd3] deallocate_for_instance() {{(pid=68906) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1952.716375] env[68906]: DEBUG nova.network.neutron [None req-1ff0dcfe-1419-4b8a-a4ce-0afaadd97797 tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] [instance: aed06616-d008-4695-b66e-9f40acf5ebd3] Updating instance_info_cache with network_info: [] {{(pid=68906) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1952.726923] env[68906]: INFO nova.compute.manager [None req-1ff0dcfe-1419-4b8a-a4ce-0afaadd97797 tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] [instance: aed06616-d008-4695-b66e-9f40acf5ebd3] Took 0.32 seconds to deallocate network for instance. [ 1952.822108] env[68906]: INFO nova.scheduler.client.report [None req-1ff0dcfe-1419-4b8a-a4ce-0afaadd97797 tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] Deleted allocations for instance aed06616-d008-4695-b66e-9f40acf5ebd3 [ 1952.843676] env[68906]: DEBUG oslo_concurrency.lockutils [None req-1ff0dcfe-1419-4b8a-a4ce-0afaadd97797 tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] Lock "aed06616-d008-4695-b66e-9f40acf5ebd3" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 665.701s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1952.844610] env[68906]: DEBUG oslo_concurrency.lockutils [None req-31d44fc1-fdd4-4a79-b25f-9da7cce5ce50 tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] Lock "aed06616-d008-4695-b66e-9f40acf5ebd3" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 470.476s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1952.845481] env[68906]: DEBUG oslo_concurrency.lockutils [None req-31d44fc1-fdd4-4a79-b25f-9da7cce5ce50 tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] Acquiring lock "aed06616-d008-4695-b66e-9f40acf5ebd3-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1952.845481] env[68906]: DEBUG oslo_concurrency.lockutils [None req-31d44fc1-fdd4-4a79-b25f-9da7cce5ce50 tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] Lock "aed06616-d008-4695-b66e-9f40acf5ebd3-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1952.845481] env[68906]:
DEBUG oslo_concurrency.lockutils [None req-31d44fc1-fdd4-4a79-b25f-9da7cce5ce50 tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] Lock "aed06616-d008-4695-b66e-9f40acf5ebd3-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1952.847880] env[68906]: INFO nova.compute.manager [None req-31d44fc1-fdd4-4a79-b25f-9da7cce5ce50 tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] [instance: aed06616-d008-4695-b66e-9f40acf5ebd3] Terminating instance [ 1952.849201] env[68906]: DEBUG nova.compute.manager [None req-31d44fc1-fdd4-4a79-b25f-9da7cce5ce50 tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] [instance: aed06616-d008-4695-b66e-9f40acf5ebd3] Start destroying the instance on the hypervisor. {{(pid=68906) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1952.849399] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-31d44fc1-fdd4-4a79-b25f-9da7cce5ce50 tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] [instance: aed06616-d008-4695-b66e-9f40acf5ebd3] Destroying instance {{(pid=68906) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1952.849872] env[68906]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-e4a489a2-bae4-4f8c-b575-8e71c69b8385 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1952.859822] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d49c683e-f638-4739-bdca-e41c09078600 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1952.871446] env[68906]: DEBUG nova.compute.manager [None req-6ecfa4f2-113a-4e91-9370-b3cb03d6f639 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: 8bfc91d4-b1d7-449a-8d48-0e63490fe663] Starting instance... {{(pid=68906) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1952.892037] env[68906]: WARNING nova.virt.vmwareapi.vmops [None req-31d44fc1-fdd4-4a79-b25f-9da7cce5ce50 tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] [instance: aed06616-d008-4695-b66e-9f40acf5ebd3] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance aed06616-d008-4695-b66e-9f40acf5ebd3 could not be found. [ 1952.892258] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-31d44fc1-fdd4-4a79-b25f-9da7cce5ce50 tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] [instance: aed06616-d008-4695-b66e-9f40acf5ebd3] Instance destroyed {{(pid=68906) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1952.892474] env[68906]: INFO nova.compute.manager [None req-31d44fc1-fdd4-4a79-b25f-9da7cce5ce50 tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] [instance: aed06616-d008-4695-b66e-9f40acf5ebd3] Took 0.04 seconds to destroy the instance on the hypervisor.
[ 1952.892723] env[68906]: DEBUG oslo.service.loopingcall [None req-31d44fc1-fdd4-4a79-b25f-9da7cce5ce50 tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=68906) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1952.892952] env[68906]: DEBUG nova.compute.manager [-] [instance: aed06616-d008-4695-b66e-9f40acf5ebd3] Deallocating network for instance {{(pid=68906) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1952.893059] env[68906]: DEBUG nova.network.neutron [-] [instance: aed06616-d008-4695-b66e-9f40acf5ebd3] deallocate_for_instance() {{(pid=68906) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1952.918508] env[68906]: DEBUG oslo_concurrency.lockutils [None req-6ecfa4f2-113a-4e91-9370-b3cb03d6f639 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1952.918761] env[68906]: DEBUG oslo_concurrency.lockutils [None req-6ecfa4f2-113a-4e91-9370-b3cb03d6f639 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1952.920205] env[68906]: INFO nova.compute.claims [None req-6ecfa4f2-113a-4e91-9370-b3cb03d6f639 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: 8bfc91d4-b1d7-449a-8d48-0e63490fe663] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1952.922816] env[68906]: DEBUG nova.network.neutron [-] [instance: aed06616-d008-4695-b66e-9f40acf5ebd3] Updating instance_info_cache with network_info: [] {{(pid=68906) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1952.930527] env[68906]: INFO nova.compute.manager [-] [instance: aed06616-d008-4695-b66e-9f40acf5ebd3] Took 0.04 seconds to deallocate network for instance. [ 1953.017949] env[68906]: DEBUG oslo_concurrency.lockutils [None req-31d44fc1-fdd4-4a79-b25f-9da7cce5ce50 tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] Lock "aed06616-d008-4695-b66e-9f40acf5ebd3" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.173s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1953.018851] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Lock "aed06616-d008-4695-b66e-9f40acf5ebd3" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 129.848s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1953.019055] env[68906]: INFO nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: aed06616-d008-4695-b66e-9f40acf5ebd3] During sync_power_state the instance has a pending task (deleting). Skip.
[ 1953.019225] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Lock "aed06616-d008-4695-b66e-9f40acf5ebd3" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1953.104249] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0d14584f-d5bb-4339-b331-fc315725b706 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1953.112178] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3201dce3-b92a-4ccc-bb6f-a1ffc09b448b {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1953.141639] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0383c8a5-c203-4a20-9e4e-642d188e173d {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1953.148203] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-51fcd217-5949-434e-98d3-04b5fd47e97d {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1953.160592] env[68906]: DEBUG nova.compute.provider_tree [None req-6ecfa4f2-113a-4e91-9370-b3cb03d6f639 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Inventory has not changed in ProviderTree for provider: 1119f6db-bfd7-4ef3-bdff-5c6974dc249b {{(pid=68906) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1953.169093] env[68906]: DEBUG nova.scheduler.client.report [None req-6ecfa4f2-113a-4e91-9370-b3cb03d6f639 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Inventory has not changed for provider 1119f6db-bfd7-4ef3-bdff-5c6974dc249b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 93, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68906) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1953.185303] env[68906]: DEBUG oslo_concurrency.lockutils [None req-6ecfa4f2-113a-4e91-9370-b3cb03d6f639 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.266s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1953.185788] env[68906]: DEBUG nova.compute.manager [None req-6ecfa4f2-113a-4e91-9370-b3cb03d6f639 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: 8bfc91d4-b1d7-449a-8d48-0e63490fe663] Start building networks asynchronously for instance.
{{(pid=68906) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1953.218390] env[68906]: DEBUG nova.compute.utils [None req-6ecfa4f2-113a-4e91-9370-b3cb03d6f639 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Using /dev/sd instead of None {{(pid=68906) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1953.219765] env[68906]: DEBUG nova.compute.manager [None req-6ecfa4f2-113a-4e91-9370-b3cb03d6f639 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: 8bfc91d4-b1d7-449a-8d48-0e63490fe663] Allocating IP information in the background. {{(pid=68906) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1953.219994] env[68906]: DEBUG nova.network.neutron [None req-6ecfa4f2-113a-4e91-9370-b3cb03d6f639 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: 8bfc91d4-b1d7-449a-8d48-0e63490fe663] allocate_for_instance() {{(pid=68906) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1953.228587] env[68906]: DEBUG nova.compute.manager [None req-6ecfa4f2-113a-4e91-9370-b3cb03d6f639 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: 8bfc91d4-b1d7-449a-8d48-0e63490fe663] Start building block device mappings for instance. {{(pid=68906) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1953.286376] env[68906]: DEBUG nova.policy [None req-6ecfa4f2-113a-4e91-9370-b3cb03d6f639 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'e208107293fd4f82af1f396d43464b69', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '90f212f7916446919081fcdc0527ebb0', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68906) authorize /opt/stack/nova/nova/policy.py:203}} [ 1953.289432] env[68906]: DEBUG nova.compute.manager [None req-6ecfa4f2-113a-4e91-9370-b3cb03d6f639 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: 8bfc91d4-b1d7-449a-8d48-0e63490fe663] Start spawning the instance on the hypervisor. 
{{(pid=68906) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1953.312388] env[68906]: DEBUG nova.virt.hardware [None req-6ecfa4f2-113a-4e91-9370-b3cb03d6f639 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-17T13:00:39Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-17T13:00:23Z,direct_url=,disk_format='vmdk',id=b1400c31-d33b-4e13-944f-4c645e62493e,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='1ae7bf3a375d41c6af5e7536af51ffd1',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-17T13:00:24Z,virtual_size=,visibility=), allow threads: False {{(pid=68906) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1953.312642] env[68906]: DEBUG nova.virt.hardware [None req-6ecfa4f2-113a-4e91-9370-b3cb03d6f639 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Flavor limits 0:0:0 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1953.312803] env[68906]: DEBUG nova.virt.hardware [None req-6ecfa4f2-113a-4e91-9370-b3cb03d6f639 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Image limits 0:0:0 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1953.312984] env[68906]: DEBUG nova.virt.hardware [None req-6ecfa4f2-113a-4e91-9370-b3cb03d6f639 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Flavor pref 0:0:0 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1953.313148] env[68906]: DEBUG nova.virt.hardware [None req-6ecfa4f2-113a-4e91-9370-b3cb03d6f639 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Image pref 0:0:0 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1953.313295] env[68906]: DEBUG nova.virt.hardware [None req-6ecfa4f2-113a-4e91-9370-b3cb03d6f639 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1953.313503] env[68906]: DEBUG nova.virt.hardware [None req-6ecfa4f2-113a-4e91-9370-b3cb03d6f639 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68906) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1953.313663] env[68906]: DEBUG nova.virt.hardware [None req-6ecfa4f2-113a-4e91-9370-b3cb03d6f639 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68906) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1953.313827] env[68906]: DEBUG nova.virt.hardware [None req-6ecfa4f2-113a-4e91-9370-b3cb03d6f639 tempest-ServersTestJSON-1226730598 
tempest-ServersTestJSON-1226730598-project-member] Got 1 possible topologies {{(pid=68906) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1953.313987] env[68906]: DEBUG nova.virt.hardware [None req-6ecfa4f2-113a-4e91-9370-b3cb03d6f639 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68906) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1953.314176] env[68906]: DEBUG nova.virt.hardware [None req-6ecfa4f2-113a-4e91-9370-b3cb03d6f639 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68906) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1953.315101] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ad013d8c-6660-4922-b0cf-036fdc9e9ce1 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1953.322702] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-adf0db0d-9594-4ad3-9e81-6f43b7d399b5 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1953.741075] env[68906]: DEBUG nova.network.neutron [None req-6ecfa4f2-113a-4e91-9370-b3cb03d6f639 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: 8bfc91d4-b1d7-449a-8d48-0e63490fe663] Successfully created port: e6609822-3e9a-46ca-8f1e-667cc84f61b9 {{(pid=68906) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1954.336362] env[68906]: DEBUG nova.network.neutron [None req-6ecfa4f2-113a-4e91-9370-b3cb03d6f639 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: 8bfc91d4-b1d7-449a-8d48-0e63490fe663] Successfully updated port: e6609822-3e9a-46ca-8f1e-667cc84f61b9 {{(pid=68906) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1954.351478] env[68906]: DEBUG oslo_concurrency.lockutils [None req-6ecfa4f2-113a-4e91-9370-b3cb03d6f639 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Acquiring lock "refresh_cache-8bfc91d4-b1d7-449a-8d48-0e63490fe663" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1954.351660] env[68906]: DEBUG oslo_concurrency.lockutils [None req-6ecfa4f2-113a-4e91-9370-b3cb03d6f639 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Acquired lock "refresh_cache-8bfc91d4-b1d7-449a-8d48-0e63490fe663" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1954.351784] env[68906]: DEBUG nova.network.neutron [None req-6ecfa4f2-113a-4e91-9370-b3cb03d6f639 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: 8bfc91d4-b1d7-449a-8d48-0e63490fe663] Building network info cache for instance {{(pid=68906) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1954.386862] env[68906]: DEBUG nova.network.neutron [None req-6ecfa4f2-113a-4e91-9370-b3cb03d6f639 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: 8bfc91d4-b1d7-449a-8d48-0e63490fe663] Instance cache missing network info. 
{{(pid=68906) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1954.544673] env[68906]: DEBUG nova.network.neutron [None req-6ecfa4f2-113a-4e91-9370-b3cb03d6f639 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: 8bfc91d4-b1d7-449a-8d48-0e63490fe663] Updating instance_info_cache with network_info: [{"id": "e6609822-3e9a-46ca-8f1e-667cc84f61b9", "address": "fa:16:3e:ab:1e:36", "network": {"id": "da6ba094-8e2a-4f76-813c-8668f482685b", "bridge": "br-int", "label": "tempest-ServersTestJSON-512380607-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "90f212f7916446919081fcdc0527ebb0", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "0cd5d325-3053-407e-a4ee-f627e82a23f9", "external-id": "nsx-vlan-transportzone-809", "segmentation_id": 809, "bound_drivers": {"0": "nsxv3"}}, "devname": "tape6609822-3e", "ovs_interfaceid": "e6609822-3e9a-46ca-8f1e-667cc84f61b9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68906) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1954.556080] env[68906]: DEBUG oslo_concurrency.lockutils [None req-6ecfa4f2-113a-4e91-9370-b3cb03d6f639 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Releasing lock "refresh_cache-8bfc91d4-b1d7-449a-8d48-0e63490fe663" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1954.556382] env[68906]: DEBUG nova.compute.manager [None req-6ecfa4f2-113a-4e91-9370-b3cb03d6f639 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: 8bfc91d4-b1d7-449a-8d48-0e63490fe663] Instance network_info: |[{"id": "e6609822-3e9a-46ca-8f1e-667cc84f61b9", "address": "fa:16:3e:ab:1e:36", "network": {"id": "da6ba094-8e2a-4f76-813c-8668f482685b", "bridge": "br-int", "label": "tempest-ServersTestJSON-512380607-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "90f212f7916446919081fcdc0527ebb0", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "0cd5d325-3053-407e-a4ee-f627e82a23f9", "external-id": "nsx-vlan-transportzone-809", "segmentation_id": 809, "bound_drivers": {"0": "nsxv3"}}, "devname": "tape6609822-3e", "ovs_interfaceid": "e6609822-3e9a-46ca-8f1e-667cc84f61b9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=68906) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1954.556834] env[68906]: 
DEBUG nova.virt.vmwareapi.vmops [None req-6ecfa4f2-113a-4e91-9370-b3cb03d6f639 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: 8bfc91d4-b1d7-449a-8d48-0e63490fe663] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:ab:1e:36', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '0cd5d325-3053-407e-a4ee-f627e82a23f9', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'e6609822-3e9a-46ca-8f1e-667cc84f61b9', 'vif_model': 'vmxnet3'}] {{(pid=68906) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1954.564330] env[68906]: DEBUG oslo.service.loopingcall [None req-6ecfa4f2-113a-4e91-9370-b3cb03d6f639 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=68906) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1954.564863] env[68906]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 8bfc91d4-b1d7-449a-8d48-0e63490fe663] Creating VM on the ESX host {{(pid=68906) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1954.565110] env[68906]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-f9770796-fd08-487e-843e-42796a029d89 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1954.585991] env[68906]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1954.585991] env[68906]: value = "task-3475451" [ 1954.585991] env[68906]: _type = "Task" [ 1954.585991] env[68906]: } to complete. {{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1954.593974] env[68906]: DEBUG oslo_vmware.api [-] Task: {'id': task-3475451, 'name': CreateVM_Task} progress is 0%. 
{{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1954.847695] env[68906]: DEBUG nova.compute.manager [req-330f1084-d808-4bb6-8388-ec6ae271e233 req-5e627434-0ca8-4c9b-8605-f44b122dc1ec service nova] [instance: 8bfc91d4-b1d7-449a-8d48-0e63490fe663] Received event network-vif-plugged-e6609822-3e9a-46ca-8f1e-667cc84f61b9 {{(pid=68906) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1954.847964] env[68906]: DEBUG oslo_concurrency.lockutils [req-330f1084-d808-4bb6-8388-ec6ae271e233 req-5e627434-0ca8-4c9b-8605-f44b122dc1ec service nova] Acquiring lock "8bfc91d4-b1d7-449a-8d48-0e63490fe663-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1954.848165] env[68906]: DEBUG oslo_concurrency.lockutils [req-330f1084-d808-4bb6-8388-ec6ae271e233 req-5e627434-0ca8-4c9b-8605-f44b122dc1ec service nova] Lock "8bfc91d4-b1d7-449a-8d48-0e63490fe663-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1954.848365] env[68906]: DEBUG oslo_concurrency.lockutils [req-330f1084-d808-4bb6-8388-ec6ae271e233 req-5e627434-0ca8-4c9b-8605-f44b122dc1ec service nova] Lock "8bfc91d4-b1d7-449a-8d48-0e63490fe663-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1954.848550] env[68906]: DEBUG nova.compute.manager [req-330f1084-d808-4bb6-8388-ec6ae271e233 req-5e627434-0ca8-4c9b-8605-f44b122dc1ec service nova] [instance: 8bfc91d4-b1d7-449a-8d48-0e63490fe663] No waiting events found dispatching network-vif-plugged-e6609822-3e9a-46ca-8f1e-667cc84f61b9 {{(pid=68906) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1954.848733] env[68906]: WARNING nova.compute.manager [req-330f1084-d808-4bb6-8388-ec6ae271e233 req-5e627434-0ca8-4c9b-8605-f44b122dc1ec service nova] [instance: 8bfc91d4-b1d7-449a-8d48-0e63490fe663] Received unexpected event network-vif-plugged-e6609822-3e9a-46ca-8f1e-667cc84f61b9 for instance with vm_state building and task_state spawning. [ 1954.848906] env[68906]: DEBUG nova.compute.manager [req-330f1084-d808-4bb6-8388-ec6ae271e233 req-5e627434-0ca8-4c9b-8605-f44b122dc1ec service nova] [instance: 8bfc91d4-b1d7-449a-8d48-0e63490fe663] Received event network-changed-e6609822-3e9a-46ca-8f1e-667cc84f61b9 {{(pid=68906) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1954.849076] env[68906]: DEBUG nova.compute.manager [req-330f1084-d808-4bb6-8388-ec6ae271e233 req-5e627434-0ca8-4c9b-8605-f44b122dc1ec service nova] [instance: 8bfc91d4-b1d7-449a-8d48-0e63490fe663] Refreshing instance network info cache due to event network-changed-e6609822-3e9a-46ca-8f1e-667cc84f61b9.
{{(pid=68906) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1954.849278] env[68906]: DEBUG oslo_concurrency.lockutils [req-330f1084-d808-4bb6-8388-ec6ae271e233 req-5e627434-0ca8-4c9b-8605-f44b122dc1ec service nova] Acquiring lock "refresh_cache-8bfc91d4-b1d7-449a-8d48-0e63490fe663" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1954.849428] env[68906]: DEBUG oslo_concurrency.lockutils [req-330f1084-d808-4bb6-8388-ec6ae271e233 req-5e627434-0ca8-4c9b-8605-f44b122dc1ec service nova] Acquired lock "refresh_cache-8bfc91d4-b1d7-449a-8d48-0e63490fe663" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1954.849592] env[68906]: DEBUG nova.network.neutron [req-330f1084-d808-4bb6-8388-ec6ae271e233 req-5e627434-0ca8-4c9b-8605-f44b122dc1ec service nova] [instance: 8bfc91d4-b1d7-449a-8d48-0e63490fe663] Refreshing network info cache for port e6609822-3e9a-46ca-8f1e-667cc84f61b9 {{(pid=68906) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1955.092903] env[68906]: DEBUG nova.network.neutron [req-330f1084-d808-4bb6-8388-ec6ae271e233 req-5e627434-0ca8-4c9b-8605-f44b122dc1ec service nova] [instance: 8bfc91d4-b1d7-449a-8d48-0e63490fe663] Updated VIF entry in instance network info cache for port e6609822-3e9a-46ca-8f1e-667cc84f61b9. {{(pid=68906) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1955.093273] env[68906]: DEBUG nova.network.neutron [req-330f1084-d808-4bb6-8388-ec6ae271e233 req-5e627434-0ca8-4c9b-8605-f44b122dc1ec service nova] [instance: 8bfc91d4-b1d7-449a-8d48-0e63490fe663] Updating instance_info_cache with network_info: [{"id": "e6609822-3e9a-46ca-8f1e-667cc84f61b9", "address": "fa:16:3e:ab:1e:36", "network": {"id": "da6ba094-8e2a-4f76-813c-8668f482685b", "bridge": "br-int", "label": "tempest-ServersTestJSON-512380607-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "90f212f7916446919081fcdc0527ebb0", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "0cd5d325-3053-407e-a4ee-f627e82a23f9", "external-id": "nsx-vlan-transportzone-809", "segmentation_id": 809, "bound_drivers": {"0": "nsxv3"}}, "devname": "tape6609822-3e", "ovs_interfaceid": "e6609822-3e9a-46ca-8f1e-667cc84f61b9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68906) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1955.098491] env[68906]: DEBUG oslo_vmware.api [-] Task: {'id': task-3475451, 'name': CreateVM_Task, 'duration_secs': 0.275284} completed successfully. 
{{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1955.098856] env[68906]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 8bfc91d4-b1d7-449a-8d48-0e63490fe663] Created VM on the ESX host {{(pid=68906) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1955.099487] env[68906]: DEBUG oslo_concurrency.lockutils [None req-6ecfa4f2-113a-4e91-9370-b3cb03d6f639 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1955.099651] env[68906]: DEBUG oslo_concurrency.lockutils [None req-6ecfa4f2-113a-4e91-9370-b3cb03d6f639 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Acquired lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1955.099965] env[68906]: DEBUG oslo_concurrency.lockutils [None req-6ecfa4f2-113a-4e91-9370-b3cb03d6f639 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1955.100243] env[68906]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-0ce9d360-c3ce-4d69-9d22-7eb026d12bd9 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1955.105165] env[68906]: DEBUG oslo_vmware.api [None req-6ecfa4f2-113a-4e91-9370-b3cb03d6f639 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Waiting for the task: (returnval){ [ 1955.105165] env[68906]: value = "session[52a3cb0d-1212-490c-2669-91043b4da4d8]523ad9a1-674e-7e9a-a9da-0a25ad7ce0f4" [ 1955.105165] env[68906]: _type = "Task" [ 1955.105165] env[68906]: } to complete. {{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1955.109345] env[68906]: DEBUG oslo_concurrency.lockutils [req-330f1084-d808-4bb6-8388-ec6ae271e233 req-5e627434-0ca8-4c9b-8605-f44b122dc1ec service nova] Releasing lock "refresh_cache-8bfc91d4-b1d7-449a-8d48-0e63490fe663" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1955.112963] env[68906]: DEBUG oslo_vmware.api [None req-6ecfa4f2-113a-4e91-9370-b3cb03d6f639 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Task: {'id': session[52a3cb0d-1212-490c-2669-91043b4da4d8]523ad9a1-674e-7e9a-a9da-0a25ad7ce0f4, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1955.615952] env[68906]: DEBUG oslo_concurrency.lockutils [None req-6ecfa4f2-113a-4e91-9370-b3cb03d6f639 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Releasing lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1955.616643] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-6ecfa4f2-113a-4e91-9370-b3cb03d6f639 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: 8bfc91d4-b1d7-449a-8d48-0e63490fe663] Processing image b1400c31-d33b-4e13-944f-4c645e62493e {{(pid=68906) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1955.616993] env[68906]: DEBUG oslo_concurrency.lockutils [None req-6ecfa4f2-113a-4e91-9370-b3cb03d6f639 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e/b1400c31-d33b-4e13-944f-4c645e62493e.vmdk" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1968.533231] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1969.919560] env[68906]: DEBUG oslo_concurrency.lockutils [None req-39b2953e-c229-4fbf-a043-48005b810281 tempest-ServerActionsTestOtherA-1507860010 tempest-ServerActionsTestOtherA-1507860010-project-member] Acquiring lock "01b79dfa-cd20-495d-b112-8429c28b741e" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1971.140766] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1971.141034] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Starting heal instance info cache {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 1971.141080] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Rebuilding the list of instances to heal {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 1971.162624] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: 17327bc3-433e-4006-93c7-e53714ed70c2] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1971.162833] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: 32f5b54d-30bf-4fe9-9622-3ff74344b3f3] Skipping network cache update for instance because it is Building. 
{{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1971.162966] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: 922d81ba-c8d2-43ba-b1c5-f2943418d6a2] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1971.163090] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: 736db39c-e5e5-4a54-b85a-aa5c703f432e] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1971.163217] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: ce6e5cd6-efb8-46d1-811d-74c084661cce] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1971.163342] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: 7994d291-b4bf-48f5-ad34-c1f484d77f6e] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1971.163464] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: 860248ea-e77b-4ff6-af64-b75f88a31348] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1971.163583] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: 3cfde5a7-3148-426c-8867-ffafb33dc95b] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1971.163703] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: 01b79dfa-cd20-495d-b112-8429c28b741e] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1971.163821] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: 8bfc91d4-b1d7-449a-8d48-0e63490fe663] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1971.163943] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Didn't find any instances for network info cache update. 
{{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 1972.140219] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1974.140164] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1974.140567] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1979.140591] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1979.140856] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1979.140988] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=68906) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 1983.135996] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1984.140936] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager.update_available_resource {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1984.151957] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1984.152230] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1984.152402] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1984.152556] env[68906]: DEBUG nova.compute.resource_tracker [None 
req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=68906) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1984.153720] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d025cf3f-ebb1-4867-84ba-ec71a0bccb74 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1984.162794] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-629bfba8-42a3-44f9-a157-03666183278c {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1984.176899] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b5e43608-f28d-4e1c-ba41-67749af5cb3d {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1984.183398] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-43940de5-6cbd-44fc-b541-a2adc7fe218c {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1984.212270] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180931MB free_disk=93GB free_vcpus=48 pci_devices=None {{(pid=68906) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1984.212417] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1984.212608] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1984.282542] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 17327bc3-433e-4006-93c7-e53714ed70c2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1984.282780] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 32f5b54d-30bf-4fe9-9622-3ff74344b3f3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1984.282946] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 922d81ba-c8d2-43ba-b1c5-f2943418d6a2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1984.283121] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 736db39c-e5e5-4a54-b85a-aa5c703f432e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1984.283285] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance ce6e5cd6-efb8-46d1-811d-74c084661cce actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1984.283408] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 7994d291-b4bf-48f5-ad34-c1f484d77f6e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1984.283529] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 860248ea-e77b-4ff6-af64-b75f88a31348 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1984.283644] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 3cfde5a7-3148-426c-8867-ffafb33dc95b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1984.283758] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 01b79dfa-cd20-495d-b112-8429c28b741e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1984.283872] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 8bfc91d4-b1d7-449a-8d48-0e63490fe663 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1984.294687] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance d70b039d-c8ad-4ffd-84f8-08f17cb97578 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1984.304890] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance ed276c3c-6085-427d-b3b7-86bbb8660dbc has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1984.305143] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=68906) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1984.305315] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=68906) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1984.443750] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fccdd03d-0f26-41cc-a293-6aad59894281 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1984.451356] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-64ca349c-fcbf-4531-860a-5224747a3893 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1984.480412] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0c4a549d-3709-457f-9682-22b5b565562b {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1984.487255] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-13fd6b07-cbb6-4b14-9a9d-29935151af40 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1984.500016] env[68906]: DEBUG nova.compute.provider_tree [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Inventory has not changed in ProviderTree for provider: 1119f6db-bfd7-4ef3-bdff-5c6974dc249b {{(pid=68906) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1984.508614] env[68906]: DEBUG nova.scheduler.client.report [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Inventory has not changed for provider 1119f6db-bfd7-4ef3-bdff-5c6974dc249b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 93, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68906) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1984.522772] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=68906) 
_update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1984.523011] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.310s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2000.548791] env[68906]: WARNING oslo_vmware.rw_handles [None req-b9080d5d-4234-4166-bade-e67041a2b08f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 2000.548791] env[68906]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 2000.548791] env[68906]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 2000.548791] env[68906]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 2000.548791] env[68906]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 2000.548791] env[68906]: ERROR oslo_vmware.rw_handles response.begin() [ 2000.548791] env[68906]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 2000.548791] env[68906]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 2000.548791] env[68906]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 2000.548791] env[68906]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 2000.548791] env[68906]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 2000.548791] env[68906]: ERROR oslo_vmware.rw_handles [ 2000.549516] env[68906]: DEBUG nova.virt.vmwareapi.images [None req-b9080d5d-4234-4166-bade-e67041a2b08f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] [instance: 17327bc3-433e-4006-93c7-e53714ed70c2] Downloaded image file data b1400c31-d33b-4e13-944f-4c645e62493e to vmware_temp/faee50ba-fae6-40d2-ac55-b02fd17cdf54/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk on the data store datastore2 {{(pid=68906) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 2000.551299] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-b9080d5d-4234-4166-bade-e67041a2b08f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] [instance: 17327bc3-433e-4006-93c7-e53714ed70c2] Caching image {{(pid=68906) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 2000.551532] env[68906]: DEBUG nova.virt.vmwareapi.vm_util [None req-b9080d5d-4234-4166-bade-e67041a2b08f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Copying Virtual Disk [datastore2] vmware_temp/faee50ba-fae6-40d2-ac55-b02fd17cdf54/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk to [datastore2] vmware_temp/faee50ba-fae6-40d2-ac55-b02fd17cdf54/b1400c31-d33b-4e13-944f-4c645e62493e/b1400c31-d33b-4e13-944f-4c645e62493e.vmdk {{(pid=68906) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 2000.551818] env[68906]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with 
opID=oslo.vmware-49cf512d-7602-4e8f-b1fb-452720f1b70d {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2000.559867] env[68906]: DEBUG oslo_vmware.api [None req-b9080d5d-4234-4166-bade-e67041a2b08f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Waiting for the task: (returnval){ [ 2000.559867] env[68906]: value = "task-3475452" [ 2000.559867] env[68906]: _type = "Task" [ 2000.559867] env[68906]: } to complete. {{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2000.567781] env[68906]: DEBUG oslo_vmware.api [None req-b9080d5d-4234-4166-bade-e67041a2b08f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Task: {'id': task-3475452, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2001.070631] env[68906]: DEBUG oslo_vmware.exceptions [None req-b9080d5d-4234-4166-bade-e67041a2b08f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Fault InvalidArgument not matched. {{(pid=68906) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 2001.070920] env[68906]: DEBUG oslo_concurrency.lockutils [None req-b9080d5d-4234-4166-bade-e67041a2b08f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Releasing lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e/b1400c31-d33b-4e13-944f-4c645e62493e.vmdk" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2001.071513] env[68906]: ERROR nova.compute.manager [None req-b9080d5d-4234-4166-bade-e67041a2b08f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] [instance: 17327bc3-433e-4006-93c7-e53714ed70c2] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2001.071513] env[68906]: Faults: ['InvalidArgument'] [ 2001.071513] env[68906]: ERROR nova.compute.manager [instance: 17327bc3-433e-4006-93c7-e53714ed70c2] Traceback (most recent call last): [ 2001.071513] env[68906]: ERROR nova.compute.manager [instance: 17327bc3-433e-4006-93c7-e53714ed70c2] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 2001.071513] env[68906]: ERROR nova.compute.manager [instance: 17327bc3-433e-4006-93c7-e53714ed70c2] yield resources [ 2001.071513] env[68906]: ERROR nova.compute.manager [instance: 17327bc3-433e-4006-93c7-e53714ed70c2] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 2001.071513] env[68906]: ERROR nova.compute.manager [instance: 17327bc3-433e-4006-93c7-e53714ed70c2] self.driver.spawn(context, instance, image_meta, [ 2001.071513] env[68906]: ERROR nova.compute.manager [instance: 17327bc3-433e-4006-93c7-e53714ed70c2] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2001.071513] env[68906]: ERROR nova.compute.manager [instance: 17327bc3-433e-4006-93c7-e53714ed70c2] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2001.071513] env[68906]: ERROR nova.compute.manager [instance: 17327bc3-433e-4006-93c7-e53714ed70c2] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2001.071513] env[68906]: 
ERROR nova.compute.manager [instance: 17327bc3-433e-4006-93c7-e53714ed70c2] self._fetch_image_if_missing(context, vi) [ 2001.071513] env[68906]: ERROR nova.compute.manager [instance: 17327bc3-433e-4006-93c7-e53714ed70c2] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2001.071900] env[68906]: ERROR nova.compute.manager [instance: 17327bc3-433e-4006-93c7-e53714ed70c2] image_cache(vi, tmp_image_ds_loc) [ 2001.071900] env[68906]: ERROR nova.compute.manager [instance: 17327bc3-433e-4006-93c7-e53714ed70c2] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2001.071900] env[68906]: ERROR nova.compute.manager [instance: 17327bc3-433e-4006-93c7-e53714ed70c2] vm_util.copy_virtual_disk( [ 2001.071900] env[68906]: ERROR nova.compute.manager [instance: 17327bc3-433e-4006-93c7-e53714ed70c2] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2001.071900] env[68906]: ERROR nova.compute.manager [instance: 17327bc3-433e-4006-93c7-e53714ed70c2] session._wait_for_task(vmdk_copy_task) [ 2001.071900] env[68906]: ERROR nova.compute.manager [instance: 17327bc3-433e-4006-93c7-e53714ed70c2] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2001.071900] env[68906]: ERROR nova.compute.manager [instance: 17327bc3-433e-4006-93c7-e53714ed70c2] return self.wait_for_task(task_ref) [ 2001.071900] env[68906]: ERROR nova.compute.manager [instance: 17327bc3-433e-4006-93c7-e53714ed70c2] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2001.071900] env[68906]: ERROR nova.compute.manager [instance: 17327bc3-433e-4006-93c7-e53714ed70c2] return evt.wait() [ 2001.071900] env[68906]: ERROR nova.compute.manager [instance: 17327bc3-433e-4006-93c7-e53714ed70c2] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2001.071900] env[68906]: ERROR nova.compute.manager [instance: 17327bc3-433e-4006-93c7-e53714ed70c2] result = hub.switch() [ 2001.071900] env[68906]: ERROR nova.compute.manager [instance: 17327bc3-433e-4006-93c7-e53714ed70c2] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2001.071900] env[68906]: ERROR nova.compute.manager [instance: 17327bc3-433e-4006-93c7-e53714ed70c2] return self.greenlet.switch() [ 2001.072302] env[68906]: ERROR nova.compute.manager [instance: 17327bc3-433e-4006-93c7-e53714ed70c2] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2001.072302] env[68906]: ERROR nova.compute.manager [instance: 17327bc3-433e-4006-93c7-e53714ed70c2] self.f(*self.args, **self.kw) [ 2001.072302] env[68906]: ERROR nova.compute.manager [instance: 17327bc3-433e-4006-93c7-e53714ed70c2] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2001.072302] env[68906]: ERROR nova.compute.manager [instance: 17327bc3-433e-4006-93c7-e53714ed70c2] raise exceptions.translate_fault(task_info.error) [ 2001.072302] env[68906]: ERROR nova.compute.manager [instance: 17327bc3-433e-4006-93c7-e53714ed70c2] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2001.072302] env[68906]: ERROR nova.compute.manager [instance: 17327bc3-433e-4006-93c7-e53714ed70c2] Faults: ['InvalidArgument'] [ 2001.072302] env[68906]: ERROR nova.compute.manager [instance: 17327bc3-433e-4006-93c7-e53714ed70c2] [ 2001.072302] 
env[68906]: INFO nova.compute.manager [None req-b9080d5d-4234-4166-bade-e67041a2b08f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] [instance: 17327bc3-433e-4006-93c7-e53714ed70c2] Terminating instance [ 2001.073465] env[68906]: DEBUG oslo_concurrency.lockutils [None req-1f23ce48-c412-41ea-9aae-c46a8abfa944 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Acquired lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e/b1400c31-d33b-4e13-944f-4c645e62493e.vmdk" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2001.073676] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-1f23ce48-c412-41ea-9aae-c46a8abfa944 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=68906) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2001.073900] env[68906]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-4306ffa8-4770-48a3-9a91-fa30bc19eba3 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2001.077400] env[68906]: DEBUG nova.compute.manager [None req-b9080d5d-4234-4166-bade-e67041a2b08f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] [instance: 17327bc3-433e-4006-93c7-e53714ed70c2] Start destroying the instance on the hypervisor. {{(pid=68906) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2001.077596] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-b9080d5d-4234-4166-bade-e67041a2b08f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] [instance: 17327bc3-433e-4006-93c7-e53714ed70c2] Destroying instance {{(pid=68906) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2001.078341] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-975fb83e-2979-4efc-af5f-c51b8b2bc94f {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2001.084764] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-b9080d5d-4234-4166-bade-e67041a2b08f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] [instance: 17327bc3-433e-4006-93c7-e53714ed70c2] Unregistering the VM {{(pid=68906) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 2001.084970] env[68906]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-d5e02f1d-c6e7-4cf2-882b-47006e9d7254 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2001.087089] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-1f23ce48-c412-41ea-9aae-c46a8abfa944 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=68906) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2001.087264] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-1f23ce48-c412-41ea-9aae-c46a8abfa944 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Folder [datastore2] devstack-image-cache_base created. 
{{(pid=68906) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 2001.088236] env[68906]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-da9a961c-49e0-4b75-aeb8-3a195f4425dd {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2001.093052] env[68906]: DEBUG oslo_vmware.api [None req-1f23ce48-c412-41ea-9aae-c46a8abfa944 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Waiting for the task: (returnval){ [ 2001.093052] env[68906]: value = "session[52a3cb0d-1212-490c-2669-91043b4da4d8]52ae4a30-58ed-318a-8466-f31925bb0556" [ 2001.093052] env[68906]: _type = "Task" [ 2001.093052] env[68906]: } to complete. {{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2001.099833] env[68906]: DEBUG oslo_vmware.api [None req-1f23ce48-c412-41ea-9aae-c46a8abfa944 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Task: {'id': session[52a3cb0d-1212-490c-2669-91043b4da4d8]52ae4a30-58ed-318a-8466-f31925bb0556, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2001.165144] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-b9080d5d-4234-4166-bade-e67041a2b08f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] [instance: 17327bc3-433e-4006-93c7-e53714ed70c2] Unregistered the VM {{(pid=68906) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 2001.165397] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-b9080d5d-4234-4166-bade-e67041a2b08f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] [instance: 17327bc3-433e-4006-93c7-e53714ed70c2] Deleting contents of the VM from datastore datastore2 {{(pid=68906) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 2001.165644] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-b9080d5d-4234-4166-bade-e67041a2b08f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Deleting the datastore file [datastore2] 17327bc3-433e-4006-93c7-e53714ed70c2 {{(pid=68906) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 2001.165961] env[68906]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-d00f01ce-bd72-4e84-85d7-e96fb25a6a02 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2001.171947] env[68906]: DEBUG oslo_vmware.api [None req-b9080d5d-4234-4166-bade-e67041a2b08f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Waiting for the task: (returnval){ [ 2001.171947] env[68906]: value = "task-3475454" [ 2001.171947] env[68906]: _type = "Task" [ 2001.171947] env[68906]: } to complete. {{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2001.179702] env[68906]: DEBUG oslo_vmware.api [None req-b9080d5d-4234-4166-bade-e67041a2b08f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Task: {'id': task-3475454, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2001.603980] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-1f23ce48-c412-41ea-9aae-c46a8abfa944 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: 32f5b54d-30bf-4fe9-9622-3ff74344b3f3] Preparing fetch location {{(pid=68906) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 2001.604279] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-1f23ce48-c412-41ea-9aae-c46a8abfa944 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Creating directory with path [datastore2] vmware_temp/b55717de-c44e-4111-b3ad-494e49989dba/b1400c31-d33b-4e13-944f-4c645e62493e {{(pid=68906) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2001.604511] env[68906]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-e3f0d5f4-554b-4933-8a4e-c3903e260862 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2001.614983] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-1f23ce48-c412-41ea-9aae-c46a8abfa944 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Created directory with path [datastore2] vmware_temp/b55717de-c44e-4111-b3ad-494e49989dba/b1400c31-d33b-4e13-944f-4c645e62493e {{(pid=68906) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2001.615185] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-1f23ce48-c412-41ea-9aae-c46a8abfa944 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: 32f5b54d-30bf-4fe9-9622-3ff74344b3f3] Fetch image to [datastore2] vmware_temp/b55717de-c44e-4111-b3ad-494e49989dba/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk {{(pid=68906) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 2001.615358] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-1f23ce48-c412-41ea-9aae-c46a8abfa944 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: 32f5b54d-30bf-4fe9-9622-3ff74344b3f3] Downloading image file data b1400c31-d33b-4e13-944f-4c645e62493e to [datastore2] vmware_temp/b55717de-c44e-4111-b3ad-494e49989dba/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk on the data store datastore2 {{(pid=68906) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 2001.616126] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b2495a21-3407-4377-bc50-09ab60fcfc65 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2001.622444] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cdb3cebc-5fda-4263-a278-2f08f7e72d11 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2001.631118] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-04f1b9b7-983f-46bc-9253-d1d3038773c7 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2001.678384] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6dd3210e-9f95-4332-a6a5-245fe303d85b {{(pid=68906) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2001.686477] env[68906]: DEBUG oslo_vmware.api [None req-b9080d5d-4234-4166-bade-e67041a2b08f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Task: {'id': task-3475454, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.065077} completed successfully. {{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2001.687923] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-b9080d5d-4234-4166-bade-e67041a2b08f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Deleted the datastore file {{(pid=68906) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 2001.688132] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-b9080d5d-4234-4166-bade-e67041a2b08f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] [instance: 17327bc3-433e-4006-93c7-e53714ed70c2] Deleted contents of the VM from datastore datastore2 {{(pid=68906) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 2001.688312] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-b9080d5d-4234-4166-bade-e67041a2b08f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] [instance: 17327bc3-433e-4006-93c7-e53714ed70c2] Instance destroyed {{(pid=68906) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2001.688485] env[68906]: INFO nova.compute.manager [None req-b9080d5d-4234-4166-bade-e67041a2b08f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] [instance: 17327bc3-433e-4006-93c7-e53714ed70c2] Took 0.61 seconds to destroy the instance on the hypervisor. 
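The records above show the task-polling pattern behind every long-running oslo.vmware call in this log: a vCenter task is created (CopyVirtualDisk_Task, DeleteDatastoreFile_Task), wait_for_task blocks while _poll_task logs "progress is N%", and the task either "completed successfully" or has its error translated into a fault such as the VimFaultException ('A specified parameter was not correct: fileType') seen earlier. A minimal sketch of that poll loop, assuming a caller-supplied get_task_info accessor (hypothetical stand-ins only; the real oslo.vmware implementation drives this through an eventlet loopingcall and its own exception hierarchy):

import time

class TaskFault(Exception):
    """Illustrative stand-in for oslo_vmware.exceptions.VimFaultException."""

def wait_for_task(get_task_info, task_ref, poll_interval=0.5):
    # Poll until the task leaves its running state, mirroring the
    # repeated "Task: {...} progress is N%" records in the log above.
    while True:
        info = get_task_info(task_ref)  # hypothetical accessor, e.g. a property read
        if info["state"] == "success":
            return info                 # logged as "completed successfully"
        if info["state"] == "error":
            raise TaskFault(info["error"])  # surfaced as e.g. InvalidArgument above
        print(f"Task: {task_ref} progress is {info.get('progress', 0)}%.")
        time.sleep(poll_interval)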
[ 2001.690241] env[68906]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-129b6e20-a619-4ae2-836d-3a04da224428 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2001.692070] env[68906]: DEBUG nova.compute.claims [None req-b9080d5d-4234-4166-bade-e67041a2b08f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] [instance: 17327bc3-433e-4006-93c7-e53714ed70c2] Aborting claim: {{(pid=68906) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 2001.692249] env[68906]: DEBUG oslo_concurrency.lockutils [None req-b9080d5d-4234-4166-bade-e67041a2b08f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2001.692472] env[68906]: DEBUG oslo_concurrency.lockutils [None req-b9080d5d-4234-4166-bade-e67041a2b08f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2001.715622] env[68906]: DEBUG nova.virt.vmwareapi.images [None req-1f23ce48-c412-41ea-9aae-c46a8abfa944 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: 32f5b54d-30bf-4fe9-9622-3ff74344b3f3] Downloading image file data b1400c31-d33b-4e13-944f-4c645e62493e to the data store datastore2 {{(pid=68906) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 2001.766660] env[68906]: DEBUG oslo_vmware.rw_handles [None req-1f23ce48-c412-41ea-9aae-c46a8abfa944 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/b55717de-c44e-4111-b3ad-494e49989dba/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=68906) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 2001.826280] env[68906]: DEBUG oslo_vmware.rw_handles [None req-1f23ce48-c412-41ea-9aae-c46a8abfa944 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Completed reading data from the image iterator. {{(pid=68906) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 2001.826470] env[68906]: DEBUG oslo_vmware.rw_handles [None req-1f23ce48-c412-41ea-9aae-c46a8abfa944 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Closing write handle for https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/b55717de-c44e-4111-b3ad-494e49989dba/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=68906) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 2001.936307] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-538018e3-55dd-4b98-9856-32d226ac7983 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2001.943752] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-99f4dfea-0fd8-4aa1-b7e2-cfee5895044d {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2001.973989] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-79fa1067-186a-4644-bb59-b0225c75e2e0 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2001.981157] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3e0e12c3-73bf-4662-a8ef-092790d2a869 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2001.994054] env[68906]: DEBUG nova.compute.provider_tree [None req-b9080d5d-4234-4166-bade-e67041a2b08f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Inventory has not changed in ProviderTree for provider: 1119f6db-bfd7-4ef3-bdff-5c6974dc249b {{(pid=68906) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2002.003953] env[68906]: DEBUG nova.scheduler.client.report [None req-b9080d5d-4234-4166-bade-e67041a2b08f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Inventory has not changed for provider 1119f6db-bfd7-4ef3-bdff-5c6974dc249b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 93, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68906) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2002.018817] env[68906]: DEBUG oslo_concurrency.lockutils [None req-b9080d5d-4234-4166-bade-e67041a2b08f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.326s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2002.019378] env[68906]: ERROR nova.compute.manager [None req-b9080d5d-4234-4166-bade-e67041a2b08f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] [instance: 17327bc3-433e-4006-93c7-e53714ed70c2] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2002.019378] env[68906]: Faults: ['InvalidArgument'] [ 2002.019378] env[68906]: ERROR nova.compute.manager [instance: 17327bc3-433e-4006-93c7-e53714ed70c2] Traceback (most recent call last): [ 2002.019378] env[68906]: ERROR nova.compute.manager [instance: 17327bc3-433e-4006-93c7-e53714ed70c2] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 2002.019378] 
env[68906]: ERROR nova.compute.manager [instance: 17327bc3-433e-4006-93c7-e53714ed70c2] self.driver.spawn(context, instance, image_meta, [ 2002.019378] env[68906]: ERROR nova.compute.manager [instance: 17327bc3-433e-4006-93c7-e53714ed70c2] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2002.019378] env[68906]: ERROR nova.compute.manager [instance: 17327bc3-433e-4006-93c7-e53714ed70c2] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2002.019378] env[68906]: ERROR nova.compute.manager [instance: 17327bc3-433e-4006-93c7-e53714ed70c2] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2002.019378] env[68906]: ERROR nova.compute.manager [instance: 17327bc3-433e-4006-93c7-e53714ed70c2] self._fetch_image_if_missing(context, vi) [ 2002.019378] env[68906]: ERROR nova.compute.manager [instance: 17327bc3-433e-4006-93c7-e53714ed70c2] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2002.019378] env[68906]: ERROR nova.compute.manager [instance: 17327bc3-433e-4006-93c7-e53714ed70c2] image_cache(vi, tmp_image_ds_loc) [ 2002.019378] env[68906]: ERROR nova.compute.manager [instance: 17327bc3-433e-4006-93c7-e53714ed70c2] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2002.019805] env[68906]: ERROR nova.compute.manager [instance: 17327bc3-433e-4006-93c7-e53714ed70c2] vm_util.copy_virtual_disk( [ 2002.019805] env[68906]: ERROR nova.compute.manager [instance: 17327bc3-433e-4006-93c7-e53714ed70c2] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2002.019805] env[68906]: ERROR nova.compute.manager [instance: 17327bc3-433e-4006-93c7-e53714ed70c2] session._wait_for_task(vmdk_copy_task) [ 2002.019805] env[68906]: ERROR nova.compute.manager [instance: 17327bc3-433e-4006-93c7-e53714ed70c2] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2002.019805] env[68906]: ERROR nova.compute.manager [instance: 17327bc3-433e-4006-93c7-e53714ed70c2] return self.wait_for_task(task_ref) [ 2002.019805] env[68906]: ERROR nova.compute.manager [instance: 17327bc3-433e-4006-93c7-e53714ed70c2] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2002.019805] env[68906]: ERROR nova.compute.manager [instance: 17327bc3-433e-4006-93c7-e53714ed70c2] return evt.wait() [ 2002.019805] env[68906]: ERROR nova.compute.manager [instance: 17327bc3-433e-4006-93c7-e53714ed70c2] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2002.019805] env[68906]: ERROR nova.compute.manager [instance: 17327bc3-433e-4006-93c7-e53714ed70c2] result = hub.switch() [ 2002.019805] env[68906]: ERROR nova.compute.manager [instance: 17327bc3-433e-4006-93c7-e53714ed70c2] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2002.019805] env[68906]: ERROR nova.compute.manager [instance: 17327bc3-433e-4006-93c7-e53714ed70c2] return self.greenlet.switch() [ 2002.019805] env[68906]: ERROR nova.compute.manager [instance: 17327bc3-433e-4006-93c7-e53714ed70c2] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2002.019805] env[68906]: ERROR nova.compute.manager [instance: 17327bc3-433e-4006-93c7-e53714ed70c2] self.f(*self.args, **self.kw) [ 2002.020360] env[68906]: ERROR nova.compute.manager [instance: 17327bc3-433e-4006-93c7-e53714ed70c2] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2002.020360] env[68906]: ERROR nova.compute.manager [instance: 17327bc3-433e-4006-93c7-e53714ed70c2] raise exceptions.translate_fault(task_info.error) [ 2002.020360] env[68906]: ERROR nova.compute.manager [instance: 17327bc3-433e-4006-93c7-e53714ed70c2] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2002.020360] env[68906]: ERROR nova.compute.manager [instance: 17327bc3-433e-4006-93c7-e53714ed70c2] Faults: ['InvalidArgument'] [ 2002.020360] env[68906]: ERROR nova.compute.manager [instance: 17327bc3-433e-4006-93c7-e53714ed70c2] [ 2002.020360] env[68906]: DEBUG nova.compute.utils [None req-b9080d5d-4234-4166-bade-e67041a2b08f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] [instance: 17327bc3-433e-4006-93c7-e53714ed70c2] VimFaultException {{(pid=68906) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 2002.021501] env[68906]: DEBUG nova.compute.manager [None req-b9080d5d-4234-4166-bade-e67041a2b08f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] [instance: 17327bc3-433e-4006-93c7-e53714ed70c2] Build of instance 17327bc3-433e-4006-93c7-e53714ed70c2 was re-scheduled: A specified parameter was not correct: fileType [ 2002.021501] env[68906]: Faults: ['InvalidArgument'] {{(pid=68906) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 2002.021862] env[68906]: DEBUG nova.compute.manager [None req-b9080d5d-4234-4166-bade-e67041a2b08f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] [instance: 17327bc3-433e-4006-93c7-e53714ed70c2] Unplugging VIFs for instance {{(pid=68906) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 2002.022051] env[68906]: DEBUG nova.compute.manager [None req-b9080d5d-4234-4166-bade-e67041a2b08f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=68906) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 2002.022228] env[68906]: DEBUG nova.compute.manager [None req-b9080d5d-4234-4166-bade-e67041a2b08f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] [instance: 17327bc3-433e-4006-93c7-e53714ed70c2] Deallocating network for instance {{(pid=68906) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 2002.022391] env[68906]: DEBUG nova.network.neutron [None req-b9080d5d-4234-4166-bade-e67041a2b08f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] [instance: 17327bc3-433e-4006-93c7-e53714ed70c2] deallocate_for_instance() {{(pid=68906) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2002.467962] env[68906]: DEBUG nova.network.neutron [None req-b9080d5d-4234-4166-bade-e67041a2b08f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] [instance: 17327bc3-433e-4006-93c7-e53714ed70c2] Updating instance_info_cache with network_info: [] {{(pid=68906) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2002.485083] env[68906]: INFO nova.compute.manager [None req-b9080d5d-4234-4166-bade-e67041a2b08f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] [instance: 17327bc3-433e-4006-93c7-e53714ed70c2] Took 0.46 seconds to deallocate network for instance. [ 2002.583076] env[68906]: INFO nova.scheduler.client.report [None req-b9080d5d-4234-4166-bade-e67041a2b08f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Deleted allocations for instance 17327bc3-433e-4006-93c7-e53714ed70c2 [ 2002.605571] env[68906]: DEBUG oslo_concurrency.lockutils [None req-b9080d5d-4234-4166-bade-e67041a2b08f tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Lock "17327bc3-433e-4006-93c7-e53714ed70c2" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 626.412s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2002.606953] env[68906]: DEBUG oslo_concurrency.lockutils [None req-6a92d79c-99f6-49e2-a1ce-a661d04c8903 tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Lock "17327bc3-433e-4006-93c7-e53714ed70c2" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 430.645s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2002.607218] env[68906]: DEBUG oslo_concurrency.lockutils [None req-6a92d79c-99f6-49e2-a1ce-a661d04c8903 tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Acquiring lock "17327bc3-433e-4006-93c7-e53714ed70c2-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2002.607434] env[68906]: DEBUG oslo_concurrency.lockutils [None req-6a92d79c-99f6-49e2-a1ce-a661d04c8903 tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Lock "17327bc3-433e-4006-93c7-e53714ed70c2-events" acquired by
"nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2002.607604] env[68906]: DEBUG oslo_concurrency.lockutils [None req-6a92d79c-99f6-49e2-a1ce-a661d04c8903 tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Lock "17327bc3-433e-4006-93c7-e53714ed70c2-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2002.610220] env[68906]: INFO nova.compute.manager [None req-6a92d79c-99f6-49e2-a1ce-a661d04c8903 tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] [instance: 17327bc3-433e-4006-93c7-e53714ed70c2] Terminating instance [ 2002.611786] env[68906]: DEBUG nova.compute.manager [None req-6a92d79c-99f6-49e2-a1ce-a661d04c8903 tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] [instance: 17327bc3-433e-4006-93c7-e53714ed70c2] Start destroying the instance on the hypervisor. {{(pid=68906) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2002.612063] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-6a92d79c-99f6-49e2-a1ce-a661d04c8903 tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] [instance: 17327bc3-433e-4006-93c7-e53714ed70c2] Destroying instance {{(pid=68906) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2002.612549] env[68906]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-c4c67d0f-d8be-4a4e-9c44-a5a85421e5d9 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2002.619093] env[68906]: DEBUG nova.compute.manager [None req-4077b494-948d-40fb-ba1f-1a5eae01fbe0 tempest-AttachVolumeTestJSON-1667500444 tempest-AttachVolumeTestJSON-1667500444-project-member] [instance: d70b039d-c8ad-4ffd-84f8-08f17cb97578] Starting instance... {{(pid=68906) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 2002.628356] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7b6ea051-8d6e-4668-a024-98e4ed1a3718 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2002.659667] env[68906]: WARNING nova.virt.vmwareapi.vmops [None req-6a92d79c-99f6-49e2-a1ce-a661d04c8903 tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] [instance: 17327bc3-433e-4006-93c7-e53714ed70c2] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 17327bc3-433e-4006-93c7-e53714ed70c2 could not be found. 
[ 2002.659906] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-6a92d79c-99f6-49e2-a1ce-a661d04c8903 tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] [instance: 17327bc3-433e-4006-93c7-e53714ed70c2] Instance destroyed {{(pid=68906) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2002.660132] env[68906]: INFO nova.compute.manager [None req-6a92d79c-99f6-49e2-a1ce-a661d04c8903 tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] [instance: 17327bc3-433e-4006-93c7-e53714ed70c2] Took 0.05 seconds to destroy the instance on the hypervisor. [ 2002.660401] env[68906]: DEBUG oslo.service.loopingcall [None req-6a92d79c-99f6-49e2-a1ce-a661d04c8903 tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=68906) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2002.661256] env[68906]: DEBUG nova.compute.manager [-] [instance: 17327bc3-433e-4006-93c7-e53714ed70c2] Deallocating network for instance {{(pid=68906) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 2002.661382] env[68906]: DEBUG nova.network.neutron [-] [instance: 17327bc3-433e-4006-93c7-e53714ed70c2] deallocate_for_instance() {{(pid=68906) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2002.678249] env[68906]: DEBUG oslo_concurrency.lockutils [None req-4077b494-948d-40fb-ba1f-1a5eae01fbe0 tempest-AttachVolumeTestJSON-1667500444 tempest-AttachVolumeTestJSON-1667500444-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2002.678503] env[68906]: DEBUG oslo_concurrency.lockutils [None req-4077b494-948d-40fb-ba1f-1a5eae01fbe0 tempest-AttachVolumeTestJSON-1667500444 tempest-AttachVolumeTestJSON-1667500444-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2002.679974] env[68906]: INFO nova.compute.claims [None req-4077b494-948d-40fb-ba1f-1a5eae01fbe0 tempest-AttachVolumeTestJSON-1667500444 tempest-AttachVolumeTestJSON-1667500444-project-member] [instance: d70b039d-c8ad-4ffd-84f8-08f17cb97578] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 2002.691665] env[68906]: DEBUG nova.network.neutron [-] [instance: 17327bc3-433e-4006-93c7-e53714ed70c2] Updating instance_info_cache with network_info: [] {{(pid=68906) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2002.701441] env[68906]: INFO nova.compute.manager [-] [instance: 17327bc3-433e-4006-93c7-e53714ed70c2] Took 0.04 seconds to deallocate network for instance.
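The lockutils lines that recur throughout this log all follow one pattern: record how long the caller waited to acquire a named lock, run the critical section, then record how long the lock was held. A plain-threading sketch of that bookkeeping (illustrative only, not oslo.concurrency itself):

```python
# Toy version of the "acquired ... waited Ns" / "released ... held Ns"
# bookkeeping seen in the oslo_concurrency.lockutils log lines.
import contextlib
import threading
import time

_locks = {}

@contextlib.contextmanager
def timed_lock(name, owner):
    lock = _locks.setdefault(name, threading.Lock())
    t0 = time.monotonic()
    lock.acquire()
    waited = time.monotonic() - t0
    print(f'Lock "{name}" acquired by "{owner}" :: waited {waited:.3f}s')
    t1 = time.monotonic()
    try:
        yield
    finally:
        held = time.monotonic() - t1
        lock.release()
        print(f'Lock "{name}" "released" by "{owner}" :: held {held:.3f}s')

with timed_lock("compute_resources", "example.instance_claim"):
    time.sleep(0.01)  # critical section
```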
[ 2002.790146] env[68906]: DEBUG oslo_concurrency.lockutils [None req-6a92d79c-99f6-49e2-a1ce-a661d04c8903 tempest-AttachInterfacesTestJSON-916650638 tempest-AttachInterfacesTestJSON-916650638-project-member] Lock "17327bc3-433e-4006-93c7-e53714ed70c2" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.183s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2002.790960] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Lock "17327bc3-433e-4006-93c7-e53714ed70c2" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 179.620s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2002.791169] env[68906]: INFO nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: 17327bc3-433e-4006-93c7-e53714ed70c2] During sync_power_state the instance has a pending task (deleting). Skip. [ 2002.791343] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Lock "17327bc3-433e-4006-93c7-e53714ed70c2" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2002.861108] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d0c63526-4d65-439a-a3f6-6ee4463b49dd {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2002.868388] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f415de81-cbd5-4870-bf21-02219d953508 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2002.898958] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8c4928ec-b266-451f-a996-d19f07496ead {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2002.905795] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ed9d231e-ff59-455f-a1a7-7135f5e5c6eb {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2002.919007] env[68906]: DEBUG nova.compute.provider_tree [None req-4077b494-948d-40fb-ba1f-1a5eae01fbe0 tempest-AttachVolumeTestJSON-1667500444 tempest-AttachVolumeTestJSON-1667500444-project-member] Inventory has not changed in ProviderTree for provider: 1119f6db-bfd7-4ef3-bdff-5c6974dc249b {{(pid=68906) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2002.928048] env[68906]: DEBUG nova.scheduler.client.report [None req-4077b494-948d-40fb-ba1f-1a5eae01fbe0 tempest-AttachVolumeTestJSON-1667500444 tempest-AttachVolumeTestJSON-1667500444-project-member] Inventory has not changed for provider 1119f6db-bfd7-4ef3-bdff-5c6974dc249b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0,
'min_unit': 1, 'max_unit': 93, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68906) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2002.941353] env[68906]: DEBUG oslo_concurrency.lockutils [None req-4077b494-948d-40fb-ba1f-1a5eae01fbe0 tempest-AttachVolumeTestJSON-1667500444 tempest-AttachVolumeTestJSON-1667500444-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.263s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2002.941817] env[68906]: DEBUG nova.compute.manager [None req-4077b494-948d-40fb-ba1f-1a5eae01fbe0 tempest-AttachVolumeTestJSON-1667500444 tempest-AttachVolumeTestJSON-1667500444-project-member] [instance: d70b039d-c8ad-4ffd-84f8-08f17cb97578] Start building networks asynchronously for instance. {{(pid=68906) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 2002.974056] env[68906]: DEBUG nova.compute.utils [None req-4077b494-948d-40fb-ba1f-1a5eae01fbe0 tempest-AttachVolumeTestJSON-1667500444 tempest-AttachVolumeTestJSON-1667500444-project-member] Using /dev/sd instead of None {{(pid=68906) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 2002.975986] env[68906]: DEBUG nova.compute.manager [None req-4077b494-948d-40fb-ba1f-1a5eae01fbe0 tempest-AttachVolumeTestJSON-1667500444 tempest-AttachVolumeTestJSON-1667500444-project-member] [instance: d70b039d-c8ad-4ffd-84f8-08f17cb97578] Allocating IP information in the background. {{(pid=68906) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 2002.976199] env[68906]: DEBUG nova.network.neutron [None req-4077b494-948d-40fb-ba1f-1a5eae01fbe0 tempest-AttachVolumeTestJSON-1667500444 tempest-AttachVolumeTestJSON-1667500444-project-member] [instance: d70b039d-c8ad-4ffd-84f8-08f17cb97578] allocate_for_instance() {{(pid=68906) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 2002.984733] env[68906]: DEBUG nova.compute.manager [None req-4077b494-948d-40fb-ba1f-1a5eae01fbe0 tempest-AttachVolumeTestJSON-1667500444 tempest-AttachVolumeTestJSON-1667500444-project-member] [instance: d70b039d-c8ad-4ffd-84f8-08f17cb97578] Start building block device mappings for instance. {{(pid=68906) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 2003.039843] env[68906]: DEBUG nova.policy [None req-4077b494-948d-40fb-ba1f-1a5eae01fbe0 tempest-AttachVolumeTestJSON-1667500444 tempest-AttachVolumeTestJSON-1667500444-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '97b90d0bf6244d02bb9f4133aa781bd8', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '530f8be6c3934b3aa339c5c3e09cf9d9', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68906) authorize /opt/stack/nova/nova/policy.py:203}} [ 2003.048538] env[68906]: DEBUG nova.compute.manager [None req-4077b494-948d-40fb-ba1f-1a5eae01fbe0 tempest-AttachVolumeTestJSON-1667500444 tempest-AttachVolumeTestJSON-1667500444-project-member] [instance: d70b039d-c8ad-4ffd-84f8-08f17cb97578] Start spawning the instance on the hypervisor. 
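The inventory record above fixes the schedulable capacity of the node: for each resource class, placement computes capacity = (total - reserved) * allocation_ratio, so VCPU yields (48 - 0) * 4.0 = 192, MEMORY_MB (196590 - 512) * 1.0 = 196078, and DISK_GB (400 - 0) * 1.0 = 400. A small worked example over the same data:

```python
# Effective schedulable capacity implied by the inventory record above.
inventory = {
    "VCPU": {"total": 48, "reserved": 0, "allocation_ratio": 4.0},
    "MEMORY_MB": {"total": 196590, "reserved": 512, "allocation_ratio": 1.0},
    "DISK_GB": {"total": 400, "reserved": 0, "allocation_ratio": 1.0},
}

for rc, inv in inventory.items():
    capacity = (inv["total"] - inv["reserved"]) * inv["allocation_ratio"]
    print(f"{rc}: {capacity:g}")
# VCPU: 192, MEMORY_MB: 196078, DISK_GB: 400
```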
{{(pid=68906) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 2003.075216] env[68906]: DEBUG nova.virt.hardware [None req-4077b494-948d-40fb-ba1f-1a5eae01fbe0 tempest-AttachVolumeTestJSON-1667500444 tempest-AttachVolumeTestJSON-1667500444-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-17T13:00:39Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-17T13:00:23Z,direct_url=<?>,disk_format='vmdk',id=b1400c31-d33b-4e13-944f-4c645e62493e,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='1ae7bf3a375d41c6af5e7536af51ffd1',properties=ImageMetaProps,protected=<?>,size=21318656,status='active',tags=<?>,updated_at=2025-04-17T13:00:24Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=68906) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 2003.075616] env[68906]: DEBUG nova.virt.hardware [None req-4077b494-948d-40fb-ba1f-1a5eae01fbe0 tempest-AttachVolumeTestJSON-1667500444 tempest-AttachVolumeTestJSON-1667500444-project-member] Flavor limits 0:0:0 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 2003.075843] env[68906]: DEBUG nova.virt.hardware [None req-4077b494-948d-40fb-ba1f-1a5eae01fbe0 tempest-AttachVolumeTestJSON-1667500444 tempest-AttachVolumeTestJSON-1667500444-project-member] Image limits 0:0:0 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 2003.076230] env[68906]: DEBUG nova.virt.hardware [None req-4077b494-948d-40fb-ba1f-1a5eae01fbe0 tempest-AttachVolumeTestJSON-1667500444 tempest-AttachVolumeTestJSON-1667500444-project-member] Flavor pref 0:0:0 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 2003.076402] env[68906]: DEBUG nova.virt.hardware [None req-4077b494-948d-40fb-ba1f-1a5eae01fbe0 tempest-AttachVolumeTestJSON-1667500444 tempest-AttachVolumeTestJSON-1667500444-project-member] Image pref 0:0:0 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 2003.076561] env[68906]: DEBUG nova.virt.hardware [None req-4077b494-948d-40fb-ba1f-1a5eae01fbe0 tempest-AttachVolumeTestJSON-1667500444 tempest-AttachVolumeTestJSON-1667500444-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 2003.076834] env[68906]: DEBUG nova.virt.hardware [None req-4077b494-948d-40fb-ba1f-1a5eae01fbe0 tempest-AttachVolumeTestJSON-1667500444 tempest-AttachVolumeTestJSON-1667500444-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68906) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 2003.077055] env[68906]: DEBUG nova.virt.hardware [None req-4077b494-948d-40fb-ba1f-1a5eae01fbe0 tempest-AttachVolumeTestJSON-1667500444 tempest-AttachVolumeTestJSON-1667500444-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68906) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 2003.077243] env[68906]: DEBUG nova.virt.hardware [None
req-4077b494-948d-40fb-ba1f-1a5eae01fbe0 tempest-AttachVolumeTestJSON-1667500444 tempest-AttachVolumeTestJSON-1667500444-project-member] Got 1 possible topologies {{(pid=68906) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 2003.077415] env[68906]: DEBUG nova.virt.hardware [None req-4077b494-948d-40fb-ba1f-1a5eae01fbe0 tempest-AttachVolumeTestJSON-1667500444 tempest-AttachVolumeTestJSON-1667500444-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68906) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 2003.077593] env[68906]: DEBUG nova.virt.hardware [None req-4077b494-948d-40fb-ba1f-1a5eae01fbe0 tempest-AttachVolumeTestJSON-1667500444 tempest-AttachVolumeTestJSON-1667500444-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68906) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 2003.079046] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2998b585-87e9-4cb2-9c53-12a936aa1eaa {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2003.087618] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-059cd521-bac0-4785-b542-8fb9472c2b8f {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2003.351617] env[68906]: DEBUG nova.network.neutron [None req-4077b494-948d-40fb-ba1f-1a5eae01fbe0 tempest-AttachVolumeTestJSON-1667500444 tempest-AttachVolumeTestJSON-1667500444-project-member] [instance: d70b039d-c8ad-4ffd-84f8-08f17cb97578] Successfully created port: 7f55a3f9-039d-4d44-adbc-e6517161be50 {{(pid=68906) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 2003.964849] env[68906]: DEBUG nova.network.neutron [None req-4077b494-948d-40fb-ba1f-1a5eae01fbe0 tempest-AttachVolumeTestJSON-1667500444 tempest-AttachVolumeTestJSON-1667500444-project-member] [instance: d70b039d-c8ad-4ffd-84f8-08f17cb97578] Successfully updated port: 7f55a3f9-039d-4d44-adbc-e6517161be50 {{(pid=68906) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 2003.975473] env[68906]: DEBUG oslo_concurrency.lockutils [None req-4077b494-948d-40fb-ba1f-1a5eae01fbe0 tempest-AttachVolumeTestJSON-1667500444 tempest-AttachVolumeTestJSON-1667500444-project-member] Acquiring lock "refresh_cache-d70b039d-c8ad-4ffd-84f8-08f17cb97578" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2003.975629] env[68906]: DEBUG oslo_concurrency.lockutils [None req-4077b494-948d-40fb-ba1f-1a5eae01fbe0 tempest-AttachVolumeTestJSON-1667500444 tempest-AttachVolumeTestJSON-1667500444-project-member] Acquired lock "refresh_cache-d70b039d-c8ad-4ffd-84f8-08f17cb97578" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2003.975812] env[68906]: DEBUG nova.network.neutron [None req-4077b494-948d-40fb-ba1f-1a5eae01fbe0 tempest-AttachVolumeTestJSON-1667500444 tempest-AttachVolumeTestJSON-1667500444-project-member] [instance: d70b039d-c8ad-4ffd-84f8-08f17cb97578] Building network info cache for instance {{(pid=68906) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 2004.020956] env[68906]: DEBUG nova.network.neutron [None req-4077b494-948d-40fb-ba1f-1a5eae01fbe0 tempest-AttachVolumeTestJSON-1667500444 
tempest-AttachVolumeTestJSON-1667500444-project-member] [instance: d70b039d-c8ad-4ffd-84f8-08f17cb97578] Instance cache missing network info. {{(pid=68906) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 2004.175816] env[68906]: DEBUG nova.network.neutron [None req-4077b494-948d-40fb-ba1f-1a5eae01fbe0 tempest-AttachVolumeTestJSON-1667500444 tempest-AttachVolumeTestJSON-1667500444-project-member] [instance: d70b039d-c8ad-4ffd-84f8-08f17cb97578] Updating instance_info_cache with network_info: [{"id": "7f55a3f9-039d-4d44-adbc-e6517161be50", "address": "fa:16:3e:7f:1d:93", "network": {"id": "fbd576ff-c6cf-4609-ba79-251b3480702c", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-1924323004-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "530f8be6c3934b3aa339c5c3e09cf9d9", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "8fedd232-bfc1-4e7f-bd5e-c43ef8f2f08a", "external-id": "nsx-vlan-transportzone-925", "segmentation_id": 925, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap7f55a3f9-03", "ovs_interfaceid": "7f55a3f9-039d-4d44-adbc-e6517161be50", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68906) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2004.188742] env[68906]: DEBUG oslo_concurrency.lockutils [None req-4077b494-948d-40fb-ba1f-1a5eae01fbe0 tempest-AttachVolumeTestJSON-1667500444 tempest-AttachVolumeTestJSON-1667500444-project-member] Releasing lock "refresh_cache-d70b039d-c8ad-4ffd-84f8-08f17cb97578" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2004.189077] env[68906]: DEBUG nova.compute.manager [None req-4077b494-948d-40fb-ba1f-1a5eae01fbe0 tempest-AttachVolumeTestJSON-1667500444 tempest-AttachVolumeTestJSON-1667500444-project-member] [instance: d70b039d-c8ad-4ffd-84f8-08f17cb97578] Instance network_info: |[{"id": "7f55a3f9-039d-4d44-adbc-e6517161be50", "address": "fa:16:3e:7f:1d:93", "network": {"id": "fbd576ff-c6cf-4609-ba79-251b3480702c", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-1924323004-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "530f8be6c3934b3aa339c5c3e09cf9d9", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "8fedd232-bfc1-4e7f-bd5e-c43ef8f2f08a", "external-id": "nsx-vlan-transportzone-925", "segmentation_id": 925, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap7f55a3f9-03", "ovs_interfaceid": "7f55a3f9-039d-4d44-adbc-e6517161be50", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": 
{}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=68906) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 2004.189494] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-4077b494-948d-40fb-ba1f-1a5eae01fbe0 tempest-AttachVolumeTestJSON-1667500444 tempest-AttachVolumeTestJSON-1667500444-project-member] [instance: d70b039d-c8ad-4ffd-84f8-08f17cb97578] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:7f:1d:93', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '8fedd232-bfc1-4e7f-bd5e-c43ef8f2f08a', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '7f55a3f9-039d-4d44-adbc-e6517161be50', 'vif_model': 'vmxnet3'}] {{(pid=68906) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 2004.197233] env[68906]: DEBUG oslo.service.loopingcall [None req-4077b494-948d-40fb-ba1f-1a5eae01fbe0 tempest-AttachVolumeTestJSON-1667500444 tempest-AttachVolumeTestJSON-1667500444-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=68906) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2004.197683] env[68906]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: d70b039d-c8ad-4ffd-84f8-08f17cb97578] Creating VM on the ESX host {{(pid=68906) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 2004.197905] env[68906]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-81c516a7-f2a2-4d5b-bc5d-bef8944cf507 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2004.218755] env[68906]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 2004.218755] env[68906]: value = "task-3475455" [ 2004.218755] env[68906]: _type = "Task" [ 2004.218755] env[68906]: } to complete. {{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2004.226634] env[68906]: DEBUG oslo_vmware.api [-] Task: {'id': task-3475455, 'name': CreateVM_Task} progress is 0%. 
{{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2004.505737] env[68906]: DEBUG nova.compute.manager [req-dfba73de-6081-4c4c-9f3d-5687d1fdd441 req-04c16ccd-33dc-4591-b80b-925ccd6a0e2d service nova] [instance: d70b039d-c8ad-4ffd-84f8-08f17cb97578] Received event network-vif-plugged-7f55a3f9-039d-4d44-adbc-e6517161be50 {{(pid=68906) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 2004.505966] env[68906]: DEBUG oslo_concurrency.lockutils [req-dfba73de-6081-4c4c-9f3d-5687d1fdd441 req-04c16ccd-33dc-4591-b80b-925ccd6a0e2d service nova] Acquiring lock "d70b039d-c8ad-4ffd-84f8-08f17cb97578-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2004.506220] env[68906]: DEBUG oslo_concurrency.lockutils [req-dfba73de-6081-4c4c-9f3d-5687d1fdd441 req-04c16ccd-33dc-4591-b80b-925ccd6a0e2d service nova] Lock "d70b039d-c8ad-4ffd-84f8-08f17cb97578-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2004.506408] env[68906]: DEBUG oslo_concurrency.lockutils [req-dfba73de-6081-4c4c-9f3d-5687d1fdd441 req-04c16ccd-33dc-4591-b80b-925ccd6a0e2d service nova] Lock "d70b039d-c8ad-4ffd-84f8-08f17cb97578-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2004.506576] env[68906]: DEBUG nova.compute.manager [req-dfba73de-6081-4c4c-9f3d-5687d1fdd441 req-04c16ccd-33dc-4591-b80b-925ccd6a0e2d service nova] [instance: d70b039d-c8ad-4ffd-84f8-08f17cb97578] No waiting events found dispatching network-vif-plugged-7f55a3f9-039d-4d44-adbc-e6517161be50 {{(pid=68906) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 2004.506749] env[68906]: WARNING nova.compute.manager [req-dfba73de-6081-4c4c-9f3d-5687d1fdd441 req-04c16ccd-33dc-4591-b80b-925ccd6a0e2d service nova] [instance: d70b039d-c8ad-4ffd-84f8-08f17cb97578] Received unexpected event network-vif-plugged-7f55a3f9-039d-4d44-adbc-e6517161be50 for instance with vm_state building and task_state spawning. [ 2004.506940] env[68906]: DEBUG nova.compute.manager [req-dfba73de-6081-4c4c-9f3d-5687d1fdd441 req-04c16ccd-33dc-4591-b80b-925ccd6a0e2d service nova] [instance: d70b039d-c8ad-4ffd-84f8-08f17cb97578] Received event network-changed-7f55a3f9-039d-4d44-adbc-e6517161be50 {{(pid=68906) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 2004.507314] env[68906]: DEBUG nova.compute.manager [req-dfba73de-6081-4c4c-9f3d-5687d1fdd441 req-04c16ccd-33dc-4591-b80b-925ccd6a0e2d service nova] [instance: d70b039d-c8ad-4ffd-84f8-08f17cb97578] Refreshing instance network info cache due to event network-changed-7f55a3f9-039d-4d44-adbc-e6517161be50.
{{(pid=68906) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 2004.507512] env[68906]: DEBUG oslo_concurrency.lockutils [req-dfba73de-6081-4c4c-9f3d-5687d1fdd441 req-04c16ccd-33dc-4591-b80b-925ccd6a0e2d service nova] Acquiring lock "refresh_cache-d70b039d-c8ad-4ffd-84f8-08f17cb97578" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2004.507650] env[68906]: DEBUG oslo_concurrency.lockutils [req-dfba73de-6081-4c4c-9f3d-5687d1fdd441 req-04c16ccd-33dc-4591-b80b-925ccd6a0e2d service nova] Acquired lock "refresh_cache-d70b039d-c8ad-4ffd-84f8-08f17cb97578" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2004.507806] env[68906]: DEBUG nova.network.neutron [req-dfba73de-6081-4c4c-9f3d-5687d1fdd441 req-04c16ccd-33dc-4591-b80b-925ccd6a0e2d service nova] [instance: d70b039d-c8ad-4ffd-84f8-08f17cb97578] Refreshing network info cache for port 7f55a3f9-039d-4d44-adbc-e6517161be50 {{(pid=68906) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 2004.728823] env[68906]: DEBUG oslo_vmware.api [-] Task: {'id': task-3475455, 'name': CreateVM_Task, 'duration_secs': 0.332596} completed successfully. {{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2004.729026] env[68906]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: d70b039d-c8ad-4ffd-84f8-08f17cb97578] Created VM on the ESX host {{(pid=68906) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 2004.729666] env[68906]: DEBUG oslo_concurrency.lockutils [None req-4077b494-948d-40fb-ba1f-1a5eae01fbe0 tempest-AttachVolumeTestJSON-1667500444 tempest-AttachVolumeTestJSON-1667500444-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2004.729833] env[68906]: DEBUG oslo_concurrency.lockutils [None req-4077b494-948d-40fb-ba1f-1a5eae01fbe0 tempest-AttachVolumeTestJSON-1667500444 tempest-AttachVolumeTestJSON-1667500444-project-member] Acquired lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2004.730189] env[68906]: DEBUG oslo_concurrency.lockutils [None req-4077b494-948d-40fb-ba1f-1a5eae01fbe0 tempest-AttachVolumeTestJSON-1667500444 tempest-AttachVolumeTestJSON-1667500444-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 2004.730443] env[68906]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-d5a970e6-bbec-434b-b33d-24101238b5e1 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2004.734456] env[68906]: DEBUG oslo_vmware.api [None req-4077b494-948d-40fb-ba1f-1a5eae01fbe0 tempest-AttachVolumeTestJSON-1667500444 tempest-AttachVolumeTestJSON-1667500444-project-member] Waiting for the task: (returnval){ [ 2004.734456] env[68906]: value = "session[52a3cb0d-1212-490c-2669-91043b4da4d8]5264c37d-ea54-1d66-9acc-8027c71f676b" [ 2004.734456] env[68906]: _type = "Task" [ 2004.734456] env[68906]: } to complete. 
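The network-vif-plugged handling above shows the external-event contract: Neutron reports the event to the compute API, and the compute manager pops a matching waiter if the spawn registered one; with no waiter registered (the instance is still building), the event is logged as unexpected and dropped. A plain-threading sketch of that pop-or-warn dispatch (illustrative only, not Nova's InstanceEvents class):

```python
# Toy external-event dispatch: pop a registered waiter or warn.
import threading

_waiters = {}  # (instance_uuid, event_name) -> threading.Event

def prepare_for_event(instance_uuid, event_name):
    ev = threading.Event()
    _waiters[(instance_uuid, event_name)] = ev
    return ev

def dispatch_external_event(instance_uuid, event_name):
    ev = _waiters.pop((instance_uuid, event_name), None)
    if ev is None:
        print(f"Received unexpected event {event_name} for {instance_uuid}")
        return
    ev.set()  # wake whoever blocked waiting for the plug to complete

uuid = "d70b039d-c8ad-4ffd-84f8-08f17cb97578"
dispatch_external_event(uuid, "network-vif-plugged")   # unexpected path
ev = prepare_for_event(uuid, "network-vif-plugged")
dispatch_external_event(uuid, "network-vif-plugged")   # pops the waiter
assert ev.is_set()
```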
{{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2004.741555] env[68906]: DEBUG oslo_vmware.api [None req-4077b494-948d-40fb-ba1f-1a5eae01fbe0 tempest-AttachVolumeTestJSON-1667500444 tempest-AttachVolumeTestJSON-1667500444-project-member] Task: {'id': session[52a3cb0d-1212-490c-2669-91043b4da4d8]5264c37d-ea54-1d66-9acc-8027c71f676b, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2004.742292] env[68906]: DEBUG nova.network.neutron [req-dfba73de-6081-4c4c-9f3d-5687d1fdd441 req-04c16ccd-33dc-4591-b80b-925ccd6a0e2d service nova] [instance: d70b039d-c8ad-4ffd-84f8-08f17cb97578] Updated VIF entry in instance network info cache for port 7f55a3f9-039d-4d44-adbc-e6517161be50. {{(pid=68906) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 2004.742603] env[68906]: DEBUG nova.network.neutron [req-dfba73de-6081-4c4c-9f3d-5687d1fdd441 req-04c16ccd-33dc-4591-b80b-925ccd6a0e2d service nova] [instance: d70b039d-c8ad-4ffd-84f8-08f17cb97578] Updating instance_info_cache with network_info: [{"id": "7f55a3f9-039d-4d44-adbc-e6517161be50", "address": "fa:16:3e:7f:1d:93", "network": {"id": "fbd576ff-c6cf-4609-ba79-251b3480702c", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-1924323004-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "530f8be6c3934b3aa339c5c3e09cf9d9", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "8fedd232-bfc1-4e7f-bd5e-c43ef8f2f08a", "external-id": "nsx-vlan-transportzone-925", "segmentation_id": 925, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap7f55a3f9-03", "ovs_interfaceid": "7f55a3f9-039d-4d44-adbc-e6517161be50", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68906) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2004.751502] env[68906]: DEBUG oslo_concurrency.lockutils [req-dfba73de-6081-4c4c-9f3d-5687d1fdd441 req-04c16ccd-33dc-4591-b80b-925ccd6a0e2d service nova] Releasing lock "refresh_cache-d70b039d-c8ad-4ffd-84f8-08f17cb97578" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2005.244877] env[68906]: DEBUG oslo_concurrency.lockutils [None req-4077b494-948d-40fb-ba1f-1a5eae01fbe0 tempest-AttachVolumeTestJSON-1667500444 tempest-AttachVolumeTestJSON-1667500444-project-member] Releasing lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2005.245203] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-4077b494-948d-40fb-ba1f-1a5eae01fbe0 tempest-AttachVolumeTestJSON-1667500444 tempest-AttachVolumeTestJSON-1667500444-project-member] [instance: d70b039d-c8ad-4ffd-84f8-08f17cb97578] Processing image b1400c31-d33b-4e13-944f-4c645e62493e {{(pid=68906) _fetch_image_if_missing 
/opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 2005.245344] env[68906]: DEBUG oslo_concurrency.lockutils [None req-4077b494-948d-40fb-ba1f-1a5eae01fbe0 tempest-AttachVolumeTestJSON-1667500444 tempest-AttachVolumeTestJSON-1667500444-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e/b1400c31-d33b-4e13-944f-4c645e62493e.vmdk" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2011.334345] env[68906]: DEBUG oslo_concurrency.lockutils [None req-e3d0afc1-6979-43e2-885d-82661245d8f2 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Acquiring lock "8bfc91d4-b1d7-449a-8d48-0e63490fe663" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2028.524805] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2033.141220] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2033.141533] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Starting heal instance info cache {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 2033.141683] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Rebuilding the list of instances to heal {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 2033.164835] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: 32f5b54d-30bf-4fe9-9622-3ff74344b3f3] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2033.164994] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: 922d81ba-c8d2-43ba-b1c5-f2943418d6a2] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2033.165120] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: 736db39c-e5e5-4a54-b85a-aa5c703f432e] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2033.165247] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: ce6e5cd6-efb8-46d1-811d-74c084661cce] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2033.165372] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: 7994d291-b4bf-48f5-ad34-c1f484d77f6e] Skipping network cache update for instance because it is Building.
{{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2033.165498] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: 860248ea-e77b-4ff6-af64-b75f88a31348] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2033.165620] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: 3cfde5a7-3148-426c-8867-ffafb33dc95b] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2033.165748] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: 01b79dfa-cd20-495d-b112-8429c28b741e] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2033.165896] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: 8bfc91d4-b1d7-449a-8d48-0e63490fe663] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2033.166030] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: d70b039d-c8ad-4ffd-84f8-08f17cb97578] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2033.166156] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Didn't find any instances for network info cache update. 
{{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 2033.166654] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2035.140613] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2036.141044] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2038.135458] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2039.140442] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2039.140759] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2039.140864] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] CONF.reclaim_instance_interval <= 0, skipping... 
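The run of "Running periodic task ComputeManager._*" lines above comes from a spacing-based scheduler: each registered task remembers when it last ran and fires again once its interval has elapsed (Nova builds this on oslo.service's periodic_task decorator). A toy version of that dispatch logic, illustrative only and not the oslo.service API:

```python
# Minimal spacing-based periodic dispatcher, like the log lines above.
import time

class PeriodicRunner:
    def __init__(self):
        self._tasks = []  # [name, spacing_seconds, fn, last_run]

    def register(self, name, spacing, fn):
        self._tasks.append([name, spacing, fn, float("-inf")])

    def run_periodic_tasks(self):
        now = time.monotonic()
        for task in self._tasks:
            name, spacing, fn, last = task
            if now - last >= spacing:
                print(f"Running periodic task {name}")
                fn()
                task[3] = now

runner = PeriodicRunner()
runner.register("ComputeManager._poll_volume_usage", 60,
                lambda: None)  # body elided; the real task polls the driver
runner.run_periodic_tasks()
```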
{{(pid=68906) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 2040.987566] env[68906]: DEBUG oslo_concurrency.lockutils [None req-c3c64d30-7f72-4792-bf2a-8b78bfcdc58b tempest-AttachVolumeTestJSON-1667500444 tempest-AttachVolumeTestJSON-1667500444-project-member] Acquiring lock "d70b039d-c8ad-4ffd-84f8-08f17cb97578" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2044.140592] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager.update_available_resource {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2044.151520] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2044.151828] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2044.151928] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2044.152095] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=68906) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 2044.153250] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-42c1bcea-255f-4efe-93a1-4961ad4f9aed {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2044.162160] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-127d39ba-e023-4848-9165-e663ead1ef1d {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2044.176127] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-14c1130a-0b7c-4bab-b179-629e0ec3c610 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2044.182430] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-34d4ebfd-6544-474d-9de0-c670c43cc153 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2044.212029] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180921MB free_disk=93GB free_vcpus=48 pci_devices=None
{{(pid=68906) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 2044.212201] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2044.212442] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2044.285185] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 32f5b54d-30bf-4fe9-9622-3ff74344b3f3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2044.285928] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 922d81ba-c8d2-43ba-b1c5-f2943418d6a2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2044.286143] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 736db39c-e5e5-4a54-b85a-aa5c703f432e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2044.286296] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance ce6e5cd6-efb8-46d1-811d-74c084661cce actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2044.286422] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 7994d291-b4bf-48f5-ad34-c1f484d77f6e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2044.286582] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 860248ea-e77b-4ff6-af64-b75f88a31348 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2044.286663] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 3cfde5a7-3148-426c-8867-ffafb33dc95b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2044.286770] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 01b79dfa-cd20-495d-b112-8429c28b741e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2044.286903] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 8bfc91d4-b1d7-449a-8d48-0e63490fe663 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2044.286991] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance d70b039d-c8ad-4ffd-84f8-08f17cb97578 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2044.301246] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance ed276c3c-6085-427d-b3b7-86bbb8660dbc has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 2044.301246] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=68906) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 2044.301246] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=68906) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 2044.428431] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-768d0c5f-1f7d-4881-aab3-4e28c5b56bfd {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2044.436101] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9476b651-9426-42ae-beac-c1f788fca37f {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2044.470441] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-29823d9a-bd6d-4abe-8830-63558d2ba520 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2044.477659] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-46044c9d-d47d-4ffb-9921-be4c630b8289 {{(pid=68906) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2044.492476] env[68906]: DEBUG nova.compute.provider_tree [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Inventory has not changed in ProviderTree for provider: 1119f6db-bfd7-4ef3-bdff-5c6974dc249b {{(pid=68906) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2044.499998] env[68906]: DEBUG nova.scheduler.client.report [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Inventory has not changed for provider 1119f6db-bfd7-4ef3-bdff-5c6974dc249b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 93, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68906) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2044.514183] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=68906) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 2044.514374] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.302s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2044.917169] env[68906]: DEBUG oslo_concurrency.lockutils [None req-8466e239-f128-4599-96da-bff01e78a993 tempest-ServerDiskConfigTestJSON-1909384467 tempest-ServerDiskConfigTestJSON-1909384467-project-member] Acquiring lock "cd208e67-55a3-4c0b-ad49-abd3a700d5ef" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2044.917422] env[68906]: DEBUG oslo_concurrency.lockutils [None req-8466e239-f128-4599-96da-bff01e78a993 tempest-ServerDiskConfigTestJSON-1909384467 tempest-ServerDiskConfigTestJSON-1909384467-project-member] Lock "cd208e67-55a3-4c0b-ad49-abd3a700d5ef" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2045.509017] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2051.129303] env[68906]: WARNING oslo_vmware.rw_handles [None req-1f23ce48-c412-41ea-9aae-c46a8abfa944 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 2051.129303] env[68906]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 2051.129303] env[68906]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 2051.129303] env[68906]: ERROR
oslo_vmware.rw_handles self._conn.getresponse() [ 2051.129303] env[68906]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 2051.129303] env[68906]: ERROR oslo_vmware.rw_handles response.begin() [ 2051.129303] env[68906]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 2051.129303] env[68906]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 2051.129303] env[68906]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 2051.129303] env[68906]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 2051.129303] env[68906]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 2051.129303] env[68906]: ERROR oslo_vmware.rw_handles [ 2051.129901] env[68906]: DEBUG nova.virt.vmwareapi.images [None req-1f23ce48-c412-41ea-9aae-c46a8abfa944 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: 32f5b54d-30bf-4fe9-9622-3ff74344b3f3] Downloaded image file data b1400c31-d33b-4e13-944f-4c645e62493e to vmware_temp/b55717de-c44e-4111-b3ad-494e49989dba/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk on the data store datastore2 {{(pid=68906) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 2051.132548] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-1f23ce48-c412-41ea-9aae-c46a8abfa944 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: 32f5b54d-30bf-4fe9-9622-3ff74344b3f3] Caching image {{(pid=68906) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 2051.132783] env[68906]: DEBUG nova.virt.vmwareapi.vm_util [None req-1f23ce48-c412-41ea-9aae-c46a8abfa944 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Copying Virtual Disk [datastore2] vmware_temp/b55717de-c44e-4111-b3ad-494e49989dba/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk to [datastore2] vmware_temp/b55717de-c44e-4111-b3ad-494e49989dba/b1400c31-d33b-4e13-944f-4c645e62493e/b1400c31-d33b-4e13-944f-4c645e62493e.vmdk {{(pid=68906) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 2051.133081] env[68906]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-9e44c67d-1cce-43ab-854e-cba889161699 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2051.140973] env[68906]: DEBUG oslo_vmware.api [None req-1f23ce48-c412-41ea-9aae-c46a8abfa944 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Waiting for the task: (returnval){ [ 2051.140973] env[68906]: value = "task-3475456" [ 2051.140973] env[68906]: _type = "Task" [ 2051.140973] env[68906]: } to complete. {{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2051.150826] env[68906]: DEBUG oslo_vmware.api [None req-1f23ce48-c412-41ea-9aae-c46a8abfa944 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Task: {'id': task-3475456, 'name': CopyVirtualDisk_Task} progress is 0%. 
{{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2051.651529] env[68906]: DEBUG oslo_vmware.exceptions [None req-1f23ce48-c412-41ea-9aae-c46a8abfa944 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Fault InvalidArgument not matched. {{(pid=68906) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 2051.651825] env[68906]: DEBUG oslo_concurrency.lockutils [None req-1f23ce48-c412-41ea-9aae-c46a8abfa944 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Releasing lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e/b1400c31-d33b-4e13-944f-4c645e62493e.vmdk" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2051.652399] env[68906]: ERROR nova.compute.manager [None req-1f23ce48-c412-41ea-9aae-c46a8abfa944 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: 32f5b54d-30bf-4fe9-9622-3ff74344b3f3] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2051.652399] env[68906]: Faults: ['InvalidArgument'] [ 2051.652399] env[68906]: ERROR nova.compute.manager [instance: 32f5b54d-30bf-4fe9-9622-3ff74344b3f3] Traceback (most recent call last): [ 2051.652399] env[68906]: ERROR nova.compute.manager [instance: 32f5b54d-30bf-4fe9-9622-3ff74344b3f3] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 2051.652399] env[68906]: ERROR nova.compute.manager [instance: 32f5b54d-30bf-4fe9-9622-3ff74344b3f3] yield resources [ 2051.652399] env[68906]: ERROR nova.compute.manager [instance: 32f5b54d-30bf-4fe9-9622-3ff74344b3f3] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 2051.652399] env[68906]: ERROR nova.compute.manager [instance: 32f5b54d-30bf-4fe9-9622-3ff74344b3f3] self.driver.spawn(context, instance, image_meta, [ 2051.652399] env[68906]: ERROR nova.compute.manager [instance: 32f5b54d-30bf-4fe9-9622-3ff74344b3f3] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2051.652399] env[68906]: ERROR nova.compute.manager [instance: 32f5b54d-30bf-4fe9-9622-3ff74344b3f3] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2051.652399] env[68906]: ERROR nova.compute.manager [instance: 32f5b54d-30bf-4fe9-9622-3ff74344b3f3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2051.652399] env[68906]: ERROR nova.compute.manager [instance: 32f5b54d-30bf-4fe9-9622-3ff74344b3f3] self._fetch_image_if_missing(context, vi) [ 2051.652399] env[68906]: ERROR nova.compute.manager [instance: 32f5b54d-30bf-4fe9-9622-3ff74344b3f3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2051.652666] env[68906]: ERROR nova.compute.manager [instance: 32f5b54d-30bf-4fe9-9622-3ff74344b3f3] image_cache(vi, tmp_image_ds_loc) [ 2051.652666] env[68906]: ERROR nova.compute.manager [instance: 32f5b54d-30bf-4fe9-9622-3ff74344b3f3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2051.652666] env[68906]: ERROR nova.compute.manager [instance: 32f5b54d-30bf-4fe9-9622-3ff74344b3f3] vm_util.copy_virtual_disk( [ 2051.652666] env[68906]: ERROR nova.compute.manager [instance: 32f5b54d-30bf-4fe9-9622-3ff74344b3f3] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in 
copy_virtual_disk [ 2051.652666] env[68906]: ERROR nova.compute.manager [instance: 32f5b54d-30bf-4fe9-9622-3ff74344b3f3] session._wait_for_task(vmdk_copy_task) [ 2051.652666] env[68906]: ERROR nova.compute.manager [instance: 32f5b54d-30bf-4fe9-9622-3ff74344b3f3] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2051.652666] env[68906]: ERROR nova.compute.manager [instance: 32f5b54d-30bf-4fe9-9622-3ff74344b3f3] return self.wait_for_task(task_ref) [ 2051.652666] env[68906]: ERROR nova.compute.manager [instance: 32f5b54d-30bf-4fe9-9622-3ff74344b3f3] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2051.652666] env[68906]: ERROR nova.compute.manager [instance: 32f5b54d-30bf-4fe9-9622-3ff74344b3f3] return evt.wait() [ 2051.652666] env[68906]: ERROR nova.compute.manager [instance: 32f5b54d-30bf-4fe9-9622-3ff74344b3f3] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2051.652666] env[68906]: ERROR nova.compute.manager [instance: 32f5b54d-30bf-4fe9-9622-3ff74344b3f3] result = hub.switch() [ 2051.652666] env[68906]: ERROR nova.compute.manager [instance: 32f5b54d-30bf-4fe9-9622-3ff74344b3f3] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2051.652666] env[68906]: ERROR nova.compute.manager [instance: 32f5b54d-30bf-4fe9-9622-3ff74344b3f3] return self.greenlet.switch() [ 2051.652934] env[68906]: ERROR nova.compute.manager [instance: 32f5b54d-30bf-4fe9-9622-3ff74344b3f3] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2051.652934] env[68906]: ERROR nova.compute.manager [instance: 32f5b54d-30bf-4fe9-9622-3ff74344b3f3] self.f(*self.args, **self.kw) [ 2051.652934] env[68906]: ERROR nova.compute.manager [instance: 32f5b54d-30bf-4fe9-9622-3ff74344b3f3] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2051.652934] env[68906]: ERROR nova.compute.manager [instance: 32f5b54d-30bf-4fe9-9622-3ff74344b3f3] raise exceptions.translate_fault(task_info.error) [ 2051.652934] env[68906]: ERROR nova.compute.manager [instance: 32f5b54d-30bf-4fe9-9622-3ff74344b3f3] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2051.652934] env[68906]: ERROR nova.compute.manager [instance: 32f5b54d-30bf-4fe9-9622-3ff74344b3f3] Faults: ['InvalidArgument'] [ 2051.652934] env[68906]: ERROR nova.compute.manager [instance: 32f5b54d-30bf-4fe9-9622-3ff74344b3f3] [ 2051.652934] env[68906]: INFO nova.compute.manager [None req-1f23ce48-c412-41ea-9aae-c46a8abfa944 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: 32f5b54d-30bf-4fe9-9622-3ff74344b3f3] Terminating instance [ 2051.654357] env[68906]: DEBUG oslo_concurrency.lockutils [None req-90e8e710-720b-4aa7-ba38-0b17c5edf061 tempest-AttachVolumeTestJSON-1667500444 tempest-AttachVolumeTestJSON-1667500444-project-member] Acquired lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e/b1400c31-d33b-4e13-944f-4c645e62493e.vmdk" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2051.654483] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-90e8e710-720b-4aa7-ba38-0b17c5edf061 tempest-AttachVolumeTestJSON-1667500444 tempest-AttachVolumeTestJSON-1667500444-project-member] Creating directory with path [datastore2] 
devstack-image-cache_base {{(pid=68906) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2051.654695] env[68906]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-c9442f77-fb89-4953-9dd2-b346432ccef1 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2051.656951] env[68906]: DEBUG nova.compute.manager [None req-1f23ce48-c412-41ea-9aae-c46a8abfa944 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: 32f5b54d-30bf-4fe9-9622-3ff74344b3f3] Start destroying the instance on the hypervisor. {{(pid=68906) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2051.657155] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-1f23ce48-c412-41ea-9aae-c46a8abfa944 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: 32f5b54d-30bf-4fe9-9622-3ff74344b3f3] Destroying instance {{(pid=68906) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2051.657898] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4e3ca5a2-fc5d-48da-80c7-b5fe19a30565 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2051.664379] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-1f23ce48-c412-41ea-9aae-c46a8abfa944 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: 32f5b54d-30bf-4fe9-9622-3ff74344b3f3] Unregistering the VM {{(pid=68906) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 2051.664619] env[68906]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-0e04a9da-faf7-47ba-9658-bb7f6e20562b {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2051.666668] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-90e8e710-720b-4aa7-ba38-0b17c5edf061 tempest-AttachVolumeTestJSON-1667500444 tempest-AttachVolumeTestJSON-1667500444-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=68906) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2051.666838] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-90e8e710-720b-4aa7-ba38-0b17c5edf061 tempest-AttachVolumeTestJSON-1667500444 tempest-AttachVolumeTestJSON-1667500444-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=68906) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 2051.667760] env[68906]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-03240264-319f-45a0-b145-467efa750e19 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2051.672503] env[68906]: DEBUG oslo_vmware.api [None req-90e8e710-720b-4aa7-ba38-0b17c5edf061 tempest-AttachVolumeTestJSON-1667500444 tempest-AttachVolumeTestJSON-1667500444-project-member] Waiting for the task: (returnval){ [ 2051.672503] env[68906]: value = "session[52a3cb0d-1212-490c-2669-91043b4da4d8]5284dd52-7634-3546-18d6-62b8a5418f61" [ 2051.672503] env[68906]: _type = "Task" [ 2051.672503] env[68906]: } to complete. 
{{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2051.679221] env[68906]: DEBUG oslo_vmware.api [None req-90e8e710-720b-4aa7-ba38-0b17c5edf061 tempest-AttachVolumeTestJSON-1667500444 tempest-AttachVolumeTestJSON-1667500444-project-member] Task: {'id': session[52a3cb0d-1212-490c-2669-91043b4da4d8]5284dd52-7634-3546-18d6-62b8a5418f61, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2051.732724] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-1f23ce48-c412-41ea-9aae-c46a8abfa944 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: 32f5b54d-30bf-4fe9-9622-3ff74344b3f3] Unregistered the VM {{(pid=68906) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 2051.732938] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-1f23ce48-c412-41ea-9aae-c46a8abfa944 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: 32f5b54d-30bf-4fe9-9622-3ff74344b3f3] Deleting contents of the VM from datastore datastore2 {{(pid=68906) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 2051.733100] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-1f23ce48-c412-41ea-9aae-c46a8abfa944 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Deleting the datastore file [datastore2] 32f5b54d-30bf-4fe9-9622-3ff74344b3f3 {{(pid=68906) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 2051.733355] env[68906]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-e4c13fc4-5921-453c-b5a0-1411b420f913 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2051.739469] env[68906]: DEBUG oslo_vmware.api [None req-1f23ce48-c412-41ea-9aae-c46a8abfa944 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Waiting for the task: (returnval){ [ 2051.739469] env[68906]: value = "task-3475458" [ 2051.739469] env[68906]: _type = "Task" [ 2051.739469] env[68906]: } to complete. {{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2051.746569] env[68906]: DEBUG oslo_vmware.api [None req-1f23ce48-c412-41ea-9aae-c46a8abfa944 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Task: {'id': task-3475458, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2052.182509] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-90e8e710-720b-4aa7-ba38-0b17c5edf061 tempest-AttachVolumeTestJSON-1667500444 tempest-AttachVolumeTestJSON-1667500444-project-member] [instance: 922d81ba-c8d2-43ba-b1c5-f2943418d6a2] Preparing fetch location {{(pid=68906) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 2052.182773] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-90e8e710-720b-4aa7-ba38-0b17c5edf061 tempest-AttachVolumeTestJSON-1667500444 tempest-AttachVolumeTestJSON-1667500444-project-member] Creating directory with path [datastore2] vmware_temp/0172a6b6-0972-4a71-a4d9-767f8d20e7d8/b1400c31-d33b-4e13-944f-4c645e62493e {{(pid=68906) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2052.183018] env[68906]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-ac219067-35ba-4e33-9fce-15aa49c360ae {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2052.194223] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-90e8e710-720b-4aa7-ba38-0b17c5edf061 tempest-AttachVolumeTestJSON-1667500444 tempest-AttachVolumeTestJSON-1667500444-project-member] Created directory with path [datastore2] vmware_temp/0172a6b6-0972-4a71-a4d9-767f8d20e7d8/b1400c31-d33b-4e13-944f-4c645e62493e {{(pid=68906) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2052.194416] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-90e8e710-720b-4aa7-ba38-0b17c5edf061 tempest-AttachVolumeTestJSON-1667500444 tempest-AttachVolumeTestJSON-1667500444-project-member] [instance: 922d81ba-c8d2-43ba-b1c5-f2943418d6a2] Fetch image to [datastore2] vmware_temp/0172a6b6-0972-4a71-a4d9-767f8d20e7d8/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk {{(pid=68906) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 2052.194587] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-90e8e710-720b-4aa7-ba38-0b17c5edf061 tempest-AttachVolumeTestJSON-1667500444 tempest-AttachVolumeTestJSON-1667500444-project-member] [instance: 922d81ba-c8d2-43ba-b1c5-f2943418d6a2] Downloading image file data b1400c31-d33b-4e13-944f-4c645e62493e to [datastore2] vmware_temp/0172a6b6-0972-4a71-a4d9-767f8d20e7d8/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk on the data store datastore2 {{(pid=68906) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 2052.195357] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-33bd715c-6973-4823-b133-345bccab9a01 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2052.201947] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-785194e6-7408-4ca1-a018-c450c44f32a2 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2052.210622] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2332d782-634f-43f6-b4f9-736ac2fa1b3d {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2052.240136] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d97f8496-8eaf-4f70-9d94-7480b6d5e6e2 {{(pid=68906) 
request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2052.251334] env[68906]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-d3bc22cb-7d2b-4415-8115-ae2bf1bcce28 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2052.252994] env[68906]: DEBUG oslo_vmware.api [None req-1f23ce48-c412-41ea-9aae-c46a8abfa944 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Task: {'id': task-3475458, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.075625} completed successfully. {{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2052.253214] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-1f23ce48-c412-41ea-9aae-c46a8abfa944 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Deleted the datastore file {{(pid=68906) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 2052.253399] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-1f23ce48-c412-41ea-9aae-c46a8abfa944 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: 32f5b54d-30bf-4fe9-9622-3ff74344b3f3] Deleted contents of the VM from datastore datastore2 {{(pid=68906) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 2052.253567] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-1f23ce48-c412-41ea-9aae-c46a8abfa944 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: 32f5b54d-30bf-4fe9-9622-3ff74344b3f3] Instance destroyed {{(pid=68906) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2052.253753] env[68906]: INFO nova.compute.manager [None req-1f23ce48-c412-41ea-9aae-c46a8abfa944 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: 32f5b54d-30bf-4fe9-9622-3ff74344b3f3] Took 0.60 seconds to destroy the instance on the hypervisor. 
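The Task objects above (task-3475456 for CopyVirtualDisk_Task, task-3475458 for DeleteDatastoreFile_Task) are all driven by the same wait_for_task/_poll_task machinery in oslo_vmware/api.py: poll the task state until vCenter reports a terminal state, then return the result or translate the fault. A minimal sketch of that pattern, with hypothetical names standing in for the real oslo.vmware internals:

    import time

    class TaskFailed(Exception):
        """Stand-in for oslo_vmware.exceptions.VimFaultException."""

    def wait_for_task(read_task_info, interval=0.5):
        # Re-read the task until it leaves its transient state; the repeated
        # "progress is 0%" entries above are emitted from exactly this kind
        # of loop.
        while True:
            info = read_task_info()  # e.g. a PropertyCollector read of task.info
            if info["state"] == "success":
                return info.get("result")
            if info["state"] == "error":
                # oslo.vmware raises exceptions.translate_fault(task_info.error)
                # here, which is where the VimFaultException above originates.
                raise TaskFailed(info["error"])
            time.sleep(interval)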
[ 2052.256337] env[68906]: DEBUG nova.compute.claims [None req-1f23ce48-c412-41ea-9aae-c46a8abfa944 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: 32f5b54d-30bf-4fe9-9622-3ff74344b3f3] Aborting claim: {{(pid=68906) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 2052.256515] env[68906]: DEBUG oslo_concurrency.lockutils [None req-1f23ce48-c412-41ea-9aae-c46a8abfa944 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2052.256726] env[68906]: DEBUG oslo_concurrency.lockutils [None req-1f23ce48-c412-41ea-9aae-c46a8abfa944 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2052.274899] env[68906]: DEBUG nova.virt.vmwareapi.images [None req-90e8e710-720b-4aa7-ba38-0b17c5edf061 tempest-AttachVolumeTestJSON-1667500444 tempest-AttachVolumeTestJSON-1667500444-project-member] [instance: 922d81ba-c8d2-43ba-b1c5-f2943418d6a2] Downloading image file data b1400c31-d33b-4e13-944f-4c645e62493e to the data store datastore2 {{(pid=68906) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 2052.384656] env[68906]: DEBUG oslo_vmware.rw_handles [None req-90e8e710-720b-4aa7-ba38-0b17c5edf061 tempest-AttachVolumeTestJSON-1667500444 tempest-AttachVolumeTestJSON-1667500444-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/0172a6b6-0972-4a71-a4d9-767f8d20e7d8/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=68906) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 2052.444937] env[68906]: DEBUG oslo_vmware.rw_handles [None req-90e8e710-720b-4aa7-ba38-0b17c5edf061 tempest-AttachVolumeTestJSON-1667500444 tempest-AttachVolumeTestJSON-1667500444-project-member] Completed reading data from the image iterator. {{(pid=68906) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 2052.445073] env[68906]: DEBUG oslo_vmware.rw_handles [None req-90e8e710-720b-4aa7-ba38-0b17c5edf061 tempest-AttachVolumeTestJSON-1667500444 tempest-AttachVolumeTestJSON-1667500444-project-member] Closing write handle for https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/0172a6b6-0972-4a71-a4d9-767f8d20e7d8/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=68906) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 2052.506873] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-651541fb-b580-48a0-b4cd-346ba3da34a6 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2052.514826] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-359d83c8-d4e6-4a73-864c-49958fd6b368 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2052.543903] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e3979327-e594-470e-acea-364097c79feb {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2052.551980] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bf183068-84a0-4d4f-979f-312a514273bb {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2052.564558] env[68906]: DEBUG nova.compute.provider_tree [None req-1f23ce48-c412-41ea-9aae-c46a8abfa944 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Inventory has not changed in ProviderTree for provider: 1119f6db-bfd7-4ef3-bdff-5c6974dc249b {{(pid=68906) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2052.573131] env[68906]: DEBUG nova.scheduler.client.report [None req-1f23ce48-c412-41ea-9aae-c46a8abfa944 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Inventory has not changed for provider 1119f6db-bfd7-4ef3-bdff-5c6974dc249b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 93, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68906) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2052.585633] env[68906]: DEBUG oslo_concurrency.lockutils [None req-1f23ce48-c412-41ea-9aae-c46a8abfa944 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.329s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2052.586163] env[68906]: ERROR nova.compute.manager [None req-1f23ce48-c412-41ea-9aae-c46a8abfa944 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: 32f5b54d-30bf-4fe9-9622-3ff74344b3f3] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2052.586163] env[68906]: Faults: ['InvalidArgument'] [ 2052.586163] env[68906]: ERROR nova.compute.manager [instance: 32f5b54d-30bf-4fe9-9622-3ff74344b3f3] Traceback (most recent call last): [ 2052.586163] env[68906]: ERROR nova.compute.manager [instance: 32f5b54d-30bf-4fe9-9622-3ff74344b3f3] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 2052.586163] env[68906]: ERROR nova.compute.manager [instance: 
32f5b54d-30bf-4fe9-9622-3ff74344b3f3] self.driver.spawn(context, instance, image_meta, [ 2052.586163] env[68906]: ERROR nova.compute.manager [instance: 32f5b54d-30bf-4fe9-9622-3ff74344b3f3] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2052.586163] env[68906]: ERROR nova.compute.manager [instance: 32f5b54d-30bf-4fe9-9622-3ff74344b3f3] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2052.586163] env[68906]: ERROR nova.compute.manager [instance: 32f5b54d-30bf-4fe9-9622-3ff74344b3f3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2052.586163] env[68906]: ERROR nova.compute.manager [instance: 32f5b54d-30bf-4fe9-9622-3ff74344b3f3] self._fetch_image_if_missing(context, vi) [ 2052.586163] env[68906]: ERROR nova.compute.manager [instance: 32f5b54d-30bf-4fe9-9622-3ff74344b3f3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2052.586163] env[68906]: ERROR nova.compute.manager [instance: 32f5b54d-30bf-4fe9-9622-3ff74344b3f3] image_cache(vi, tmp_image_ds_loc) [ 2052.586163] env[68906]: ERROR nova.compute.manager [instance: 32f5b54d-30bf-4fe9-9622-3ff74344b3f3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2052.586628] env[68906]: ERROR nova.compute.manager [instance: 32f5b54d-30bf-4fe9-9622-3ff74344b3f3] vm_util.copy_virtual_disk( [ 2052.586628] env[68906]: ERROR nova.compute.manager [instance: 32f5b54d-30bf-4fe9-9622-3ff74344b3f3] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2052.586628] env[68906]: ERROR nova.compute.manager [instance: 32f5b54d-30bf-4fe9-9622-3ff74344b3f3] session._wait_for_task(vmdk_copy_task) [ 2052.586628] env[68906]: ERROR nova.compute.manager [instance: 32f5b54d-30bf-4fe9-9622-3ff74344b3f3] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2052.586628] env[68906]: ERROR nova.compute.manager [instance: 32f5b54d-30bf-4fe9-9622-3ff74344b3f3] return self.wait_for_task(task_ref) [ 2052.586628] env[68906]: ERROR nova.compute.manager [instance: 32f5b54d-30bf-4fe9-9622-3ff74344b3f3] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2052.586628] env[68906]: ERROR nova.compute.manager [instance: 32f5b54d-30bf-4fe9-9622-3ff74344b3f3] return evt.wait() [ 2052.586628] env[68906]: ERROR nova.compute.manager [instance: 32f5b54d-30bf-4fe9-9622-3ff74344b3f3] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2052.586628] env[68906]: ERROR nova.compute.manager [instance: 32f5b54d-30bf-4fe9-9622-3ff74344b3f3] result = hub.switch() [ 2052.586628] env[68906]: ERROR nova.compute.manager [instance: 32f5b54d-30bf-4fe9-9622-3ff74344b3f3] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2052.586628] env[68906]: ERROR nova.compute.manager [instance: 32f5b54d-30bf-4fe9-9622-3ff74344b3f3] return self.greenlet.switch() [ 2052.586628] env[68906]: ERROR nova.compute.manager [instance: 32f5b54d-30bf-4fe9-9622-3ff74344b3f3] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2052.586628] env[68906]: ERROR nova.compute.manager [instance: 32f5b54d-30bf-4fe9-9622-3ff74344b3f3] self.f(*self.args, **self.kw) [ 2052.586888] env[68906]: ERROR nova.compute.manager [instance: 32f5b54d-30bf-4fe9-9622-3ff74344b3f3] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2052.586888] env[68906]: ERROR nova.compute.manager [instance: 32f5b54d-30bf-4fe9-9622-3ff74344b3f3] raise exceptions.translate_fault(task_info.error) [ 2052.586888] env[68906]: ERROR nova.compute.manager [instance: 32f5b54d-30bf-4fe9-9622-3ff74344b3f3] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2052.586888] env[68906]: ERROR nova.compute.manager [instance: 32f5b54d-30bf-4fe9-9622-3ff74344b3f3] Faults: ['InvalidArgument'] [ 2052.586888] env[68906]: ERROR nova.compute.manager [instance: 32f5b54d-30bf-4fe9-9622-3ff74344b3f3] [ 2052.586888] env[68906]: DEBUG nova.compute.utils [None req-1f23ce48-c412-41ea-9aae-c46a8abfa944 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: 32f5b54d-30bf-4fe9-9622-3ff74344b3f3] VimFaultException {{(pid=68906) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 2052.588222] env[68906]: DEBUG nova.compute.manager [None req-1f23ce48-c412-41ea-9aae-c46a8abfa944 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: 32f5b54d-30bf-4fe9-9622-3ff74344b3f3] Build of instance 32f5b54d-30bf-4fe9-9622-3ff74344b3f3 was re-scheduled: A specified parameter was not correct: fileType [ 2052.588222] env[68906]: Faults: ['InvalidArgument'] {{(pid=68906) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 2052.588617] env[68906]: DEBUG nova.compute.manager [None req-1f23ce48-c412-41ea-9aae-c46a8abfa944 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: 32f5b54d-30bf-4fe9-9622-3ff74344b3f3] Unplugging VIFs for instance {{(pid=68906) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 2052.588793] env[68906]: DEBUG nova.compute.manager [None req-1f23ce48-c412-41ea-9aae-c46a8abfa944 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=68906) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 2052.588965] env[68906]: DEBUG nova.compute.manager [None req-1f23ce48-c412-41ea-9aae-c46a8abfa944 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: 32f5b54d-30bf-4fe9-9622-3ff74344b3f3] Deallocating network for instance {{(pid=68906) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 2052.589146] env[68906]: DEBUG nova.network.neutron [None req-1f23ce48-c412-41ea-9aae-c46a8abfa944 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: 32f5b54d-30bf-4fe9-9622-3ff74344b3f3] deallocate_for_instance() {{(pid=68906) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2052.895946] env[68906]: DEBUG nova.network.neutron [None req-1f23ce48-c412-41ea-9aae-c46a8abfa944 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: 32f5b54d-30bf-4fe9-9622-3ff74344b3f3] Updating instance_info_cache with network_info: [] {{(pid=68906) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2052.910460] env[68906]: INFO nova.compute.manager [None req-1f23ce48-c412-41ea-9aae-c46a8abfa944 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: 32f5b54d-30bf-4fe9-9622-3ff74344b3f3] Took 0.32 seconds to deallocate network for instance. [ 2053.008472] env[68906]: INFO nova.scheduler.client.report [None req-1f23ce48-c412-41ea-9aae-c46a8abfa944 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Deleted allocations for instance 32f5b54d-30bf-4fe9-9622-3ff74344b3f3 [ 2053.028857] env[68906]: DEBUG oslo_concurrency.lockutils [None req-1f23ce48-c412-41ea-9aae-c46a8abfa944 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Lock "32f5b54d-30bf-4fe9-9622-3ff74344b3f3" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 630.980s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2053.030615] env[68906]: DEBUG oslo_concurrency.lockutils [None req-576bfd48-3236-4615-8685-9b78101a6ee3 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Lock "32f5b54d-30bf-4fe9-9622-3ff74344b3f3" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 435.082s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2053.030872] env[68906]: DEBUG oslo_concurrency.lockutils [None req-576bfd48-3236-4615-8685-9b78101a6ee3 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Acquiring lock "32f5b54d-30bf-4fe9-9622-3ff74344b3f3-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2053.031112] env[68906]: DEBUG oslo_concurrency.lockutils [None req-576bfd48-3236-4615-8685-9b78101a6ee3 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Lock "32f5b54d-30bf-4fe9-9622-3ff74344b3f3-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2053.031263] env[68906]: DEBUG oslo_concurrency.lockutils [None req-576bfd48-3236-4615-8685-9b78101a6ee3 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Lock "32f5b54d-30bf-4fe9-9622-3ff74344b3f3-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2053.033126] env[68906]: INFO nova.compute.manager [None req-576bfd48-3236-4615-8685-9b78101a6ee3 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: 32f5b54d-30bf-4fe9-9622-3ff74344b3f3] Terminating instance [ 2053.034848] env[68906]: DEBUG nova.compute.manager [None req-576bfd48-3236-4615-8685-9b78101a6ee3 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: 32f5b54d-30bf-4fe9-9622-3ff74344b3f3] Start destroying the instance on the hypervisor. {{(pid=68906) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2053.035086] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-576bfd48-3236-4615-8685-9b78101a6ee3 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: 32f5b54d-30bf-4fe9-9622-3ff74344b3f3] Destroying instance {{(pid=68906) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2053.035642] env[68906]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-b0488767-2cbd-4552-bb58-fe8b2f04a60b {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2053.041474] env[68906]: DEBUG nova.compute.manager [None req-a0bccc79-5040-43c0-84e3-cf4ab89f848f tempest-DeleteServersTestJSON-1763795391 tempest-DeleteServersTestJSON-1763795391-project-member] [instance: ed276c3c-6085-427d-b3b7-86bbb8660dbc] Starting instance... {{(pid=68906) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 2053.048121] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8ef67e3c-762a-46fe-85c4-8c5cd760d6ee {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2053.076765] env[68906]: WARNING nova.virt.vmwareapi.vmops [None req-576bfd48-3236-4615-8685-9b78101a6ee3 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: 32f5b54d-30bf-4fe9-9622-3ff74344b3f3] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 32f5b54d-30bf-4fe9-9622-3ff74344b3f3 could not be found. [ 2053.076975] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-576bfd48-3236-4615-8685-9b78101a6ee3 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: 32f5b54d-30bf-4fe9-9622-3ff74344b3f3] Instance destroyed {{(pid=68906) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2053.077167] env[68906]: INFO nova.compute.manager [None req-576bfd48-3236-4615-8685-9b78101a6ee3 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: 32f5b54d-30bf-4fe9-9622-3ff74344b3f3] Took 0.04 seconds to destroy the instance on the hypervisor.
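The destroy of 32f5b54d completes in 0.04 seconds because the earlier re-schedule cleanup already unregistered the VM: vmops catches InstanceNotFound from the lookup, logs the WARNING above, and reports the instance destroyed rather than failing the delete. A rough sketch of that tolerance, with hypothetical helper names (the real path is destroy/_destroy_instance in nova/virt/vmwareapi/vmops.py):

    class InstanceNotFound(Exception):
        """Stand-in for nova.exception.InstanceNotFound."""

    def destroy(instance_uuid, lookup_vm, unregister_vm, log):
        # A VM that is already gone is treated as successfully destroyed,
        # which keeps instance deletion idempotent across retries.
        try:
            vm_ref = lookup_vm(instance_uuid)
        except InstanceNotFound:
            log.warning("Instance does not exist on backend: %s", instance_uuid)
            return
        unregister_vm(vm_ref)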
[ 2053.077435] env[68906]: DEBUG oslo.service.loopingcall [None req-576bfd48-3236-4615-8685-9b78101a6ee3 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=68906) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2053.078308] env[68906]: DEBUG nova.compute.manager [-] [instance: 32f5b54d-30bf-4fe9-9622-3ff74344b3f3] Deallocating network for instance {{(pid=68906) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 2053.078415] env[68906]: DEBUG nova.network.neutron [-] [instance: 32f5b54d-30bf-4fe9-9622-3ff74344b3f3] deallocate_for_instance() {{(pid=68906) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2053.094865] env[68906]: DEBUG oslo_concurrency.lockutils [None req-a0bccc79-5040-43c0-84e3-cf4ab89f848f tempest-DeleteServersTestJSON-1763795391 tempest-DeleteServersTestJSON-1763795391-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2053.095113] env[68906]: DEBUG oslo_concurrency.lockutils [None req-a0bccc79-5040-43c0-84e3-cf4ab89f848f tempest-DeleteServersTestJSON-1763795391 tempest-DeleteServersTestJSON-1763795391-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2053.096515] env[68906]: INFO nova.compute.claims [None req-a0bccc79-5040-43c0-84e3-cf4ab89f848f tempest-DeleteServersTestJSON-1763795391 tempest-DeleteServersTestJSON-1763795391-project-member] [instance: ed276c3c-6085-427d-b3b7-86bbb8660dbc] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 2053.114604] env[68906]: DEBUG nova.network.neutron [-] [instance: 32f5b54d-30bf-4fe9-9622-3ff74344b3f3] Updating instance_info_cache with network_info: [] {{(pid=68906) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2053.138709] env[68906]: INFO nova.compute.manager [-] [instance: 32f5b54d-30bf-4fe9-9622-3ff74344b3f3] Took 0.06 seconds to deallocate network for instance. [ 2053.233813] env[68906]: DEBUG oslo_concurrency.lockutils [None req-576bfd48-3236-4615-8685-9b78101a6ee3 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Lock "32f5b54d-30bf-4fe9-9622-3ff74344b3f3" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.203s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2053.234684] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Lock "32f5b54d-30bf-4fe9-9622-3ff74344b3f3" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 230.064s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2053.235308] env[68906]: INFO nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: 32f5b54d-30bf-4fe9-9622-3ff74344b3f3] During sync_power_state the instance has a pending task (deleting). Skip.
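The claim for ed276c3c succeeds despite 10 vCPUs already being allocated because placement sizes capacity per resource class as (total - reserved) * allocation_ratio over the inventory the report client keeps logging. Re-doing that arithmetic from the logged numbers:

    # Inventory as logged for provider 1119f6db-bfd7-4ef3-bdff-5c6974dc249b,
    # trimmed to the fields that determine capacity.
    inventory = {
        "VCPU": {"total": 48, "reserved": 0, "allocation_ratio": 4.0},
        "MEMORY_MB": {"total": 196590, "reserved": 512, "allocation_ratio": 1.0},
        "DISK_GB": {"total": 400, "reserved": 0, "allocation_ratio": 1.0},
    }
    for rc, inv in inventory.items():
        print(rc, (inv["total"] - inv["reserved"]) * inv["allocation_ratio"])
    # VCPU 192.0, MEMORY_MB 196078.0, DISK_GB 400.0 -- so the 1 VCPU / 128 MB /
    # 1 GB claim fits easily next to the 10 vCPUs and 1792 MB already in use.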
[ 2053.235308] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Lock "32f5b54d-30bf-4fe9-9622-3ff74344b3f3" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2053.294413] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0e788f4b-892c-4123-8179-7ebbe77f5693 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2053.302050] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f9fee5a8-238b-4968-befc-c548d75ab1ed {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2053.332836] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9f3f9603-a12e-4f8c-b468-b6de1c44bfd8 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2053.340020] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4e3c0962-b3f7-4c74-89a8-cbe9d020cc90 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2053.353056] env[68906]: DEBUG nova.compute.provider_tree [None req-a0bccc79-5040-43c0-84e3-cf4ab89f848f tempest-DeleteServersTestJSON-1763795391 tempest-DeleteServersTestJSON-1763795391-project-member] Inventory has not changed in ProviderTree for provider: 1119f6db-bfd7-4ef3-bdff-5c6974dc249b {{(pid=68906) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2053.361972] env[68906]: DEBUG nova.scheduler.client.report [None req-a0bccc79-5040-43c0-84e3-cf4ab89f848f tempest-DeleteServersTestJSON-1763795391 tempest-DeleteServersTestJSON-1763795391-project-member] Inventory has not changed for provider 1119f6db-bfd7-4ef3-bdff-5c6974dc249b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 93, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68906) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2053.375546] env[68906]: DEBUG oslo_concurrency.lockutils [None req-a0bccc79-5040-43c0-84e3-cf4ab89f848f tempest-DeleteServersTestJSON-1763795391 tempest-DeleteServersTestJSON-1763795391-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.280s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2053.376067] env[68906]: DEBUG nova.compute.manager [None req-a0bccc79-5040-43c0-84e3-cf4ab89f848f tempest-DeleteServersTestJSON-1763795391 tempest-DeleteServersTestJSON-1763795391-project-member] [instance: ed276c3c-6085-427d-b3b7-86bbb8660dbc] Start building networks asynchronously for instance. 
{{(pid=68906) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 2053.410928] env[68906]: DEBUG nova.compute.utils [None req-a0bccc79-5040-43c0-84e3-cf4ab89f848f tempest-DeleteServersTestJSON-1763795391 tempest-DeleteServersTestJSON-1763795391-project-member] Using /dev/sd instead of None {{(pid=68906) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 2053.412657] env[68906]: DEBUG nova.compute.manager [None req-a0bccc79-5040-43c0-84e3-cf4ab89f848f tempest-DeleteServersTestJSON-1763795391 tempest-DeleteServersTestJSON-1763795391-project-member] [instance: ed276c3c-6085-427d-b3b7-86bbb8660dbc] Allocating IP information in the background. {{(pid=68906) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 2053.412939] env[68906]: DEBUG nova.network.neutron [None req-a0bccc79-5040-43c0-84e3-cf4ab89f848f tempest-DeleteServersTestJSON-1763795391 tempest-DeleteServersTestJSON-1763795391-project-member] [instance: ed276c3c-6085-427d-b3b7-86bbb8660dbc] allocate_for_instance() {{(pid=68906) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 2053.423234] env[68906]: DEBUG nova.compute.manager [None req-a0bccc79-5040-43c0-84e3-cf4ab89f848f tempest-DeleteServersTestJSON-1763795391 tempest-DeleteServersTestJSON-1763795391-project-member] [instance: ed276c3c-6085-427d-b3b7-86bbb8660dbc] Start building block device mappings for instance. {{(pid=68906) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 2053.493397] env[68906]: DEBUG nova.compute.manager [None req-a0bccc79-5040-43c0-84e3-cf4ab89f848f tempest-DeleteServersTestJSON-1763795391 tempest-DeleteServersTestJSON-1763795391-project-member] [instance: ed276c3c-6085-427d-b3b7-86bbb8660dbc] Start spawning the instance on the hypervisor. 
{{(pid=68906) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 2053.497058] env[68906]: DEBUG nova.policy [None req-a0bccc79-5040-43c0-84e3-cf4ab89f848f tempest-DeleteServersTestJSON-1763795391 tempest-DeleteServersTestJSON-1763795391-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '37f9526491514cb78fe2d78897a9f6da', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'ab3b2dcf3ab6492caa68615817414acd', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68906) authorize /opt/stack/nova/nova/policy.py:203}} [ 2053.529059] env[68906]: DEBUG nova.virt.hardware [None req-a0bccc79-5040-43c0-84e3-cf4ab89f848f tempest-DeleteServersTestJSON-1763795391 tempest-DeleteServersTestJSON-1763795391-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-17T13:00:39Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-17T13:00:23Z,direct_url=,disk_format='vmdk',id=b1400c31-d33b-4e13-944f-4c645e62493e,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='1ae7bf3a375d41c6af5e7536af51ffd1',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-17T13:00:24Z,virtual_size=,visibility=), allow threads: False {{(pid=68906) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 2053.529330] env[68906]: DEBUG nova.virt.hardware [None req-a0bccc79-5040-43c0-84e3-cf4ab89f848f tempest-DeleteServersTestJSON-1763795391 tempest-DeleteServersTestJSON-1763795391-project-member] Flavor limits 0:0:0 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 2053.529494] env[68906]: DEBUG nova.virt.hardware [None req-a0bccc79-5040-43c0-84e3-cf4ab89f848f tempest-DeleteServersTestJSON-1763795391 tempest-DeleteServersTestJSON-1763795391-project-member] Image limits 0:0:0 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 2053.529669] env[68906]: DEBUG nova.virt.hardware [None req-a0bccc79-5040-43c0-84e3-cf4ab89f848f tempest-DeleteServersTestJSON-1763795391 tempest-DeleteServersTestJSON-1763795391-project-member] Flavor pref 0:0:0 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 2053.529838] env[68906]: DEBUG nova.virt.hardware [None req-a0bccc79-5040-43c0-84e3-cf4ab89f848f tempest-DeleteServersTestJSON-1763795391 tempest-DeleteServersTestJSON-1763795391-project-member] Image pref 0:0:0 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 2053.529993] env[68906]: DEBUG nova.virt.hardware [None req-a0bccc79-5040-43c0-84e3-cf4ab89f848f tempest-DeleteServersTestJSON-1763795391 tempest-DeleteServersTestJSON-1763795391-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 2053.530359] env[68906]: DEBUG nova.virt.hardware [None 
req-a0bccc79-5040-43c0-84e3-cf4ab89f848f tempest-DeleteServersTestJSON-1763795391 tempest-DeleteServersTestJSON-1763795391-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68906) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 2053.530595] env[68906]: DEBUG nova.virt.hardware [None req-a0bccc79-5040-43c0-84e3-cf4ab89f848f tempest-DeleteServersTestJSON-1763795391 tempest-DeleteServersTestJSON-1763795391-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68906) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 2053.530827] env[68906]: DEBUG nova.virt.hardware [None req-a0bccc79-5040-43c0-84e3-cf4ab89f848f tempest-DeleteServersTestJSON-1763795391 tempest-DeleteServersTestJSON-1763795391-project-member] Got 1 possible topologies {{(pid=68906) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 2053.531248] env[68906]: DEBUG nova.virt.hardware [None req-a0bccc79-5040-43c0-84e3-cf4ab89f848f tempest-DeleteServersTestJSON-1763795391 tempest-DeleteServersTestJSON-1763795391-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68906) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 2053.531248] env[68906]: DEBUG nova.virt.hardware [None req-a0bccc79-5040-43c0-84e3-cf4ab89f848f tempest-DeleteServersTestJSON-1763795391 tempest-DeleteServersTestJSON-1763795391-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68906) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 2053.535022] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-37bd27fc-a8c6-4645-ae60-1c1fcf6d467c {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2053.540541] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-931ba911-c505-4c6c-86d6-fc8996d5c1ec {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2053.798668] env[68906]: DEBUG nova.network.neutron [None req-a0bccc79-5040-43c0-84e3-cf4ab89f848f tempest-DeleteServersTestJSON-1763795391 tempest-DeleteServersTestJSON-1763795391-project-member] [instance: ed276c3c-6085-427d-b3b7-86bbb8660dbc] Successfully created port: 4daa4e35-0b69-4a40-af3d-cb10d06b86f5 {{(pid=68906) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 2054.356606] env[68906]: DEBUG nova.network.neutron [None req-a0bccc79-5040-43c0-84e3-cf4ab89f848f tempest-DeleteServersTestJSON-1763795391 tempest-DeleteServersTestJSON-1763795391-project-member] [instance: ed276c3c-6085-427d-b3b7-86bbb8660dbc] Successfully updated port: 4daa4e35-0b69-4a40-af3d-cb10d06b86f5 {{(pid=68906) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 2054.368160] env[68906]: DEBUG oslo_concurrency.lockutils [None req-a0bccc79-5040-43c0-84e3-cf4ab89f848f tempest-DeleteServersTestJSON-1763795391 tempest-DeleteServersTestJSON-1763795391-project-member] Acquiring lock "refresh_cache-ed276c3c-6085-427d-b3b7-86bbb8660dbc" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2054.368316] env[68906]: DEBUG oslo_concurrency.lockutils [None req-a0bccc79-5040-43c0-84e3-cf4ab89f848f tempest-DeleteServersTestJSON-1763795391 
tempest-DeleteServersTestJSON-1763795391-project-member] Acquired lock "refresh_cache-ed276c3c-6085-427d-b3b7-86bbb8660dbc" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2054.368496] env[68906]: DEBUG nova.network.neutron [None req-a0bccc79-5040-43c0-84e3-cf4ab89f848f tempest-DeleteServersTestJSON-1763795391 tempest-DeleteServersTestJSON-1763795391-project-member] [instance: ed276c3c-6085-427d-b3b7-86bbb8660dbc] Building network info cache for instance {{(pid=68906) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 2054.404492] env[68906]: DEBUG nova.network.neutron [None req-a0bccc79-5040-43c0-84e3-cf4ab89f848f tempest-DeleteServersTestJSON-1763795391 tempest-DeleteServersTestJSON-1763795391-project-member] [instance: ed276c3c-6085-427d-b3b7-86bbb8660dbc] Instance cache missing network info. {{(pid=68906) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 2054.560038] env[68906]: DEBUG nova.network.neutron [None req-a0bccc79-5040-43c0-84e3-cf4ab89f848f tempest-DeleteServersTestJSON-1763795391 tempest-DeleteServersTestJSON-1763795391-project-member] [instance: ed276c3c-6085-427d-b3b7-86bbb8660dbc] Updating instance_info_cache with network_info: [{"id": "4daa4e35-0b69-4a40-af3d-cb10d06b86f5", "address": "fa:16:3e:f2:a1:5e", "network": {"id": "54b104a2-9ae1-471c-8a6a-6eff5a3f91d0", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-467006053-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "ab3b2dcf3ab6492caa68615817414acd", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ed9dc063-5c7a-4591-ba7d-b58b861d7f63", "external-id": "nsx-vlan-transportzone-37", "segmentation_id": 37, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap4daa4e35-0b", "ovs_interfaceid": "4daa4e35-0b69-4a40-af3d-cb10d06b86f5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68906) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2054.571638] env[68906]: DEBUG oslo_concurrency.lockutils [None req-a0bccc79-5040-43c0-84e3-cf4ab89f848f tempest-DeleteServersTestJSON-1763795391 tempest-DeleteServersTestJSON-1763795391-project-member] Releasing lock "refresh_cache-ed276c3c-6085-427d-b3b7-86bbb8660dbc" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2054.571638] env[68906]: DEBUG nova.compute.manager [None req-a0bccc79-5040-43c0-84e3-cf4ab89f848f tempest-DeleteServersTestJSON-1763795391 tempest-DeleteServersTestJSON-1763795391-project-member] [instance: ed276c3c-6085-427d-b3b7-86bbb8660dbc] Instance network_info: |[{"id": "4daa4e35-0b69-4a40-af3d-cb10d06b86f5", "address": "fa:16:3e:f2:a1:5e", "network": {"id": "54b104a2-9ae1-471c-8a6a-6eff5a3f91d0", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-467006053-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, 
"ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "ab3b2dcf3ab6492caa68615817414acd", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ed9dc063-5c7a-4591-ba7d-b58b861d7f63", "external-id": "nsx-vlan-transportzone-37", "segmentation_id": 37, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap4daa4e35-0b", "ovs_interfaceid": "4daa4e35-0b69-4a40-af3d-cb10d06b86f5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=68906) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 2054.571845] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-a0bccc79-5040-43c0-84e3-cf4ab89f848f tempest-DeleteServersTestJSON-1763795391 tempest-DeleteServersTestJSON-1763795391-project-member] [instance: ed276c3c-6085-427d-b3b7-86bbb8660dbc] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:f2:a1:5e', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'ed9dc063-5c7a-4591-ba7d-b58b861d7f63', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '4daa4e35-0b69-4a40-af3d-cb10d06b86f5', 'vif_model': 'vmxnet3'}] {{(pid=68906) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 2054.579626] env[68906]: DEBUG nova.virt.vmwareapi.vm_util [None req-a0bccc79-5040-43c0-84e3-cf4ab89f848f tempest-DeleteServersTestJSON-1763795391 tempest-DeleteServersTestJSON-1763795391-project-member] Creating folder: Project (ab3b2dcf3ab6492caa68615817414acd). Parent ref: group-v694750. {{(pid=68906) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 2054.580197] env[68906]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-99fef918-d022-47ac-8fb7-e581ce5ad9b6 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2054.590430] env[68906]: INFO nova.virt.vmwareapi.vm_util [None req-a0bccc79-5040-43c0-84e3-cf4ab89f848f tempest-DeleteServersTestJSON-1763795391 tempest-DeleteServersTestJSON-1763795391-project-member] Created folder: Project (ab3b2dcf3ab6492caa68615817414acd) in parent group-v694750. [ 2054.590602] env[68906]: DEBUG nova.virt.vmwareapi.vm_util [None req-a0bccc79-5040-43c0-84e3-cf4ab89f848f tempest-DeleteServersTestJSON-1763795391 tempest-DeleteServersTestJSON-1763795391-project-member] Creating folder: Instances. Parent ref: group-v694856. {{(pid=68906) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 2054.590813] env[68906]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-781e3c59-bf23-47bc-b37b-87ddbbb5f761 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2054.598875] env[68906]: INFO nova.virt.vmwareapi.vm_util [None req-a0bccc79-5040-43c0-84e3-cf4ab89f848f tempest-DeleteServersTestJSON-1763795391 tempest-DeleteServersTestJSON-1763795391-project-member] Created folder: Instances in parent group-v694856. 
[ 2054.599604] env[68906]: DEBUG oslo.service.loopingcall [None req-a0bccc79-5040-43c0-84e3-cf4ab89f848f tempest-DeleteServersTestJSON-1763795391 tempest-DeleteServersTestJSON-1763795391-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=68906) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2054.599604] env[68906]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: ed276c3c-6085-427d-b3b7-86bbb8660dbc] Creating VM on the ESX host {{(pid=68906) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 2054.599604] env[68906]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-faaec11e-93b2-4bb0-bd7f-30d9ab253eae {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2054.617817] env[68906]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 2054.617817] env[68906]: value = "task-3475461" [ 2054.617817] env[68906]: _type = "Task" [ 2054.617817] env[68906]: } to complete. {{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2054.624909] env[68906]: DEBUG oslo_vmware.api [-] Task: {'id': task-3475461, 'name': CreateVM_Task} progress is 0%. {{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2054.935299] env[68906]: DEBUG nova.compute.manager [req-af3ced77-bb9c-4eb0-b61d-e8c50f1c5d60 req-2198151a-434e-4c6b-8a21-665ccfdc5180 service nova] [instance: ed276c3c-6085-427d-b3b7-86bbb8660dbc] Received event network-vif-plugged-4daa4e35-0b69-4a40-af3d-cb10d06b86f5 {{(pid=68906) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 2054.935893] env[68906]: DEBUG oslo_concurrency.lockutils [req-af3ced77-bb9c-4eb0-b61d-e8c50f1c5d60 req-2198151a-434e-4c6b-8a21-665ccfdc5180 service nova] Acquiring lock "ed276c3c-6085-427d-b3b7-86bbb8660dbc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2054.936137] env[68906]: DEBUG oslo_concurrency.lockutils [req-af3ced77-bb9c-4eb0-b61d-e8c50f1c5d60 req-2198151a-434e-4c6b-8a21-665ccfdc5180 service nova] Lock "ed276c3c-6085-427d-b3b7-86bbb8660dbc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2054.936317] env[68906]: DEBUG oslo_concurrency.lockutils [req-af3ced77-bb9c-4eb0-b61d-e8c50f1c5d60 req-2198151a-434e-4c6b-8a21-665ccfdc5180 service nova] Lock "ed276c3c-6085-427d-b3b7-86bbb8660dbc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2054.936493] env[68906]: DEBUG nova.compute.manager [req-af3ced77-bb9c-4eb0-b61d-e8c50f1c5d60 req-2198151a-434e-4c6b-8a21-665ccfdc5180 service nova] [instance: ed276c3c-6085-427d-b3b7-86bbb8660dbc] No waiting events found dispatching network-vif-plugged-4daa4e35-0b69-4a40-af3d-cb10d06b86f5 {{(pid=68906) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 2054.936661] env[68906]: WARNING nova.compute.manager [req-af3ced77-bb9c-4eb0-b61d-e8c50f1c5d60 req-2198151a-434e-4c6b-8a21-665ccfdc5180 service nova] [instance: ed276c3c-6085-427d-b3b7-86bbb8660dbc] 
Received unexpected event network-vif-plugged-4daa4e35-0b69-4a40-af3d-cb10d06b86f5 for instance with vm_state building and task_state spawning. [ 2054.936824] env[68906]: DEBUG nova.compute.manager [req-af3ced77-bb9c-4eb0-b61d-e8c50f1c5d60 req-2198151a-434e-4c6b-8a21-665ccfdc5180 service nova] [instance: ed276c3c-6085-427d-b3b7-86bbb8660dbc] Received event network-changed-4daa4e35-0b69-4a40-af3d-cb10d06b86f5 {{(pid=68906) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 2054.936978] env[68906]: DEBUG nova.compute.manager [req-af3ced77-bb9c-4eb0-b61d-e8c50f1c5d60 req-2198151a-434e-4c6b-8a21-665ccfdc5180 service nova] [instance: ed276c3c-6085-427d-b3b7-86bbb8660dbc] Refreshing instance network info cache due to event network-changed-4daa4e35-0b69-4a40-af3d-cb10d06b86f5. {{(pid=68906) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 2054.937176] env[68906]: DEBUG oslo_concurrency.lockutils [req-af3ced77-bb9c-4eb0-b61d-e8c50f1c5d60 req-2198151a-434e-4c6b-8a21-665ccfdc5180 service nova] Acquiring lock "refresh_cache-ed276c3c-6085-427d-b3b7-86bbb8660dbc" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2054.937315] env[68906]: DEBUG oslo_concurrency.lockutils [req-af3ced77-bb9c-4eb0-b61d-e8c50f1c5d60 req-2198151a-434e-4c6b-8a21-665ccfdc5180 service nova] Acquired lock "refresh_cache-ed276c3c-6085-427d-b3b7-86bbb8660dbc" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2054.937502] env[68906]: DEBUG nova.network.neutron [req-af3ced77-bb9c-4eb0-b61d-e8c50f1c5d60 req-2198151a-434e-4c6b-8a21-665ccfdc5180 service nova] [instance: ed276c3c-6085-427d-b3b7-86bbb8660dbc] Refreshing network info cache for port 4daa4e35-0b69-4a40-af3d-cb10d06b86f5 {{(pid=68906) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 2055.129543] env[68906]: DEBUG oslo_vmware.api [-] Task: {'id': task-3475461, 'name': CreateVM_Task, 'duration_secs': 0.296139} completed successfully. 
{{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2055.129739] env[68906]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: ed276c3c-6085-427d-b3b7-86bbb8660dbc] Created VM on the ESX host {{(pid=68906) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 2055.130402] env[68906]: DEBUG oslo_concurrency.lockutils [None req-a0bccc79-5040-43c0-84e3-cf4ab89f848f tempest-DeleteServersTestJSON-1763795391 tempest-DeleteServersTestJSON-1763795391-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2055.130569] env[68906]: DEBUG oslo_concurrency.lockutils [None req-a0bccc79-5040-43c0-84e3-cf4ab89f848f tempest-DeleteServersTestJSON-1763795391 tempest-DeleteServersTestJSON-1763795391-project-member] Acquired lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2055.130890] env[68906]: DEBUG oslo_concurrency.lockutils [None req-a0bccc79-5040-43c0-84e3-cf4ab89f848f tempest-DeleteServersTestJSON-1763795391 tempest-DeleteServersTestJSON-1763795391-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 2055.131146] env[68906]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-72a4413b-6db1-4900-be06-7c9c0aee6dd3 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2055.135336] env[68906]: DEBUG oslo_vmware.api [None req-a0bccc79-5040-43c0-84e3-cf4ab89f848f tempest-DeleteServersTestJSON-1763795391 tempest-DeleteServersTestJSON-1763795391-project-member] Waiting for the task: (returnval){ [ 2055.135336] env[68906]: value = "session[52a3cb0d-1212-490c-2669-91043b4da4d8]5289f755-b46e-d216-8d34-718cfba2d232" [ 2055.135336] env[68906]: _type = "Task" [ 2055.135336] env[68906]: } to complete. {{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2055.142492] env[68906]: DEBUG oslo_vmware.api [None req-a0bccc79-5040-43c0-84e3-cf4ab89f848f tempest-DeleteServersTestJSON-1763795391 tempest-DeleteServersTestJSON-1763795391-project-member] Task: {'id': session[52a3cb0d-1212-490c-2669-91043b4da4d8]5289f755-b46e-d216-8d34-718cfba2d232, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2055.164581] env[68906]: DEBUG nova.network.neutron [req-af3ced77-bb9c-4eb0-b61d-e8c50f1c5d60 req-2198151a-434e-4c6b-8a21-665ccfdc5180 service nova] [instance: ed276c3c-6085-427d-b3b7-86bbb8660dbc] Updated VIF entry in instance network info cache for port 4daa4e35-0b69-4a40-af3d-cb10d06b86f5. 
{{(pid=68906) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 2055.164907] env[68906]: DEBUG nova.network.neutron [req-af3ced77-bb9c-4eb0-b61d-e8c50f1c5d60 req-2198151a-434e-4c6b-8a21-665ccfdc5180 service nova] [instance: ed276c3c-6085-427d-b3b7-86bbb8660dbc] Updating instance_info_cache with network_info: [{"id": "4daa4e35-0b69-4a40-af3d-cb10d06b86f5", "address": "fa:16:3e:f2:a1:5e", "network": {"id": "54b104a2-9ae1-471c-8a6a-6eff5a3f91d0", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-467006053-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "ab3b2dcf3ab6492caa68615817414acd", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ed9dc063-5c7a-4591-ba7d-b58b861d7f63", "external-id": "nsx-vlan-transportzone-37", "segmentation_id": 37, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap4daa4e35-0b", "ovs_interfaceid": "4daa4e35-0b69-4a40-af3d-cb10d06b86f5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68906) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2055.174158] env[68906]: DEBUG oslo_concurrency.lockutils [req-af3ced77-bb9c-4eb0-b61d-e8c50f1c5d60 req-2198151a-434e-4c6b-8a21-665ccfdc5180 service nova] Releasing lock "refresh_cache-ed276c3c-6085-427d-b3b7-86bbb8660dbc" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2055.645515] env[68906]: DEBUG oslo_concurrency.lockutils [None req-a0bccc79-5040-43c0-84e3-cf4ab89f848f tempest-DeleteServersTestJSON-1763795391 tempest-DeleteServersTestJSON-1763795391-project-member] Releasing lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2055.646152] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-a0bccc79-5040-43c0-84e3-cf4ab89f848f tempest-DeleteServersTestJSON-1763795391 tempest-DeleteServersTestJSON-1763795391-project-member] [instance: ed276c3c-6085-427d-b3b7-86bbb8660dbc] Processing image b1400c31-d33b-4e13-944f-4c645e62493e {{(pid=68906) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 2055.646152] env[68906]: DEBUG oslo_concurrency.lockutils [None req-a0bccc79-5040-43c0-84e3-cf4ab89f848f tempest-DeleteServersTestJSON-1763795391 tempest-DeleteServersTestJSON-1763795391-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e/b1400c31-d33b-4e13-944f-4c645e62493e.vmdk" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2071.416632] env[68906]: DEBUG oslo_concurrency.lockutils [None req-abb6e557-a4c5-4fae-95f1-d08eed20205b tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] Acquiring lock "a4f1c6a3-c189-4e3a-8ac9-ac6ec8b95723" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68906) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2071.416982] env[68906]: DEBUG oslo_concurrency.lockutils [None req-abb6e557-a4c5-4fae-95f1-d08eed20205b tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] Lock "a4f1c6a3-c189-4e3a-8ac9-ac6ec8b95723" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2078.842363] env[68906]: DEBUG oslo_concurrency.lockutils [None req-05fb8c3a-bcb0-4441-8aaa-03f6402644c5 tempest-DeleteServersTestJSON-1763795391 tempest-DeleteServersTestJSON-1763795391-project-member] Acquiring lock "ed276c3c-6085-427d-b3b7-86bbb8660dbc" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2088.141413] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2094.140616] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2095.141423] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2095.141761] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Starting heal instance info cache {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 2095.141761] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Rebuilding the list of instances to heal {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 2095.163636] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: 922d81ba-c8d2-43ba-b1c5-f2943418d6a2] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2095.163785] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: 736db39c-e5e5-4a54-b85a-aa5c703f432e] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2095.163907] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: ce6e5cd6-efb8-46d1-811d-74c084661cce] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2095.164042] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: 7994d291-b4bf-48f5-ad34-c1f484d77f6e] Skipping network cache update for instance because it is Building. 
{{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2095.164169] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: 860248ea-e77b-4ff6-af64-b75f88a31348] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2095.164290] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: 3cfde5a7-3148-426c-8867-ffafb33dc95b] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2095.164410] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: 01b79dfa-cd20-495d-b112-8429c28b741e] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2095.164529] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: 8bfc91d4-b1d7-449a-8d48-0e63490fe663] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2095.164648] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: d70b039d-c8ad-4ffd-84f8-08f17cb97578] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2095.164766] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: ed276c3c-6085-427d-b3b7-86bbb8660dbc] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2095.164886] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Didn't find any instances for network info cache update. 
{{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 2096.141028] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2097.140397] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2098.140270] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._cleanup_incomplete_migrations {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2098.140440] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Cleaning up deleted instances with incomplete migration {{(pid=68906) _cleanup_incomplete_migrations /opt/stack/nova/nova/compute/manager.py:11236}} [ 2099.150626] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2100.587459] env[68906]: WARNING oslo_vmware.rw_handles [None req-90e8e710-720b-4aa7-ba38-0b17c5edf061 tempest-AttachVolumeTestJSON-1667500444 tempest-AttachVolumeTestJSON-1667500444-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 2100.587459] env[68906]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 2100.587459] env[68906]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 2100.587459] env[68906]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 2100.587459] env[68906]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 2100.587459] env[68906]: ERROR oslo_vmware.rw_handles response.begin() [ 2100.587459] env[68906]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 2100.587459] env[68906]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 2100.587459] env[68906]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 2100.587459] env[68906]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 2100.587459] env[68906]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 2100.587459] env[68906]: ERROR oslo_vmware.rw_handles [ 2100.588022] env[68906]: DEBUG nova.virt.vmwareapi.images [None req-90e8e710-720b-4aa7-ba38-0b17c5edf061 tempest-AttachVolumeTestJSON-1667500444 tempest-AttachVolumeTestJSON-1667500444-project-member] [instance: 922d81ba-c8d2-43ba-b1c5-f2943418d6a2] Downloaded image file data b1400c31-d33b-4e13-944f-4c645e62493e to vmware_temp/0172a6b6-0972-4a71-a4d9-767f8d20e7d8/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk on the data store datastore2 {{(pid=68906) fetch_image 
/opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 2100.589955] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-90e8e710-720b-4aa7-ba38-0b17c5edf061 tempest-AttachVolumeTestJSON-1667500444 tempest-AttachVolumeTestJSON-1667500444-project-member] [instance: 922d81ba-c8d2-43ba-b1c5-f2943418d6a2] Caching image {{(pid=68906) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 2100.590268] env[68906]: DEBUG nova.virt.vmwareapi.vm_util [None req-90e8e710-720b-4aa7-ba38-0b17c5edf061 tempest-AttachVolumeTestJSON-1667500444 tempest-AttachVolumeTestJSON-1667500444-project-member] Copying Virtual Disk [datastore2] vmware_temp/0172a6b6-0972-4a71-a4d9-767f8d20e7d8/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk to [datastore2] vmware_temp/0172a6b6-0972-4a71-a4d9-767f8d20e7d8/b1400c31-d33b-4e13-944f-4c645e62493e/b1400c31-d33b-4e13-944f-4c645e62493e.vmdk {{(pid=68906) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 2100.591036] env[68906]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-148b4047-ffd2-4529-864d-530c47f3ccb2 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2100.598981] env[68906]: DEBUG oslo_vmware.api [None req-90e8e710-720b-4aa7-ba38-0b17c5edf061 tempest-AttachVolumeTestJSON-1667500444 tempest-AttachVolumeTestJSON-1667500444-project-member] Waiting for the task: (returnval){ [ 2100.598981] env[68906]: value = "task-3475462" [ 2100.598981] env[68906]: _type = "Task" [ 2100.598981] env[68906]: } to complete. {{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2100.607765] env[68906]: DEBUG oslo_vmware.api [None req-90e8e710-720b-4aa7-ba38-0b17c5edf061 tempest-AttachVolumeTestJSON-1667500444 tempest-AttachVolumeTestJSON-1667500444-project-member] Task: {'id': task-3475462, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2101.109665] env[68906]: DEBUG oslo_vmware.exceptions [None req-90e8e710-720b-4aa7-ba38-0b17c5edf061 tempest-AttachVolumeTestJSON-1667500444 tempest-AttachVolumeTestJSON-1667500444-project-member] Fault InvalidArgument not matched. 
{{(pid=68906) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 2101.109967] env[68906]: DEBUG oslo_concurrency.lockutils [None req-90e8e710-720b-4aa7-ba38-0b17c5edf061 tempest-AttachVolumeTestJSON-1667500444 tempest-AttachVolumeTestJSON-1667500444-project-member] Releasing lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e/b1400c31-d33b-4e13-944f-4c645e62493e.vmdk" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2101.110537] env[68906]: ERROR nova.compute.manager [None req-90e8e710-720b-4aa7-ba38-0b17c5edf061 tempest-AttachVolumeTestJSON-1667500444 tempest-AttachVolumeTestJSON-1667500444-project-member] [instance: 922d81ba-c8d2-43ba-b1c5-f2943418d6a2] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2101.110537] env[68906]: Faults: ['InvalidArgument'] [ 2101.110537] env[68906]: ERROR nova.compute.manager [instance: 922d81ba-c8d2-43ba-b1c5-f2943418d6a2] Traceback (most recent call last): [ 2101.110537] env[68906]: ERROR nova.compute.manager [instance: 922d81ba-c8d2-43ba-b1c5-f2943418d6a2] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 2101.110537] env[68906]: ERROR nova.compute.manager [instance: 922d81ba-c8d2-43ba-b1c5-f2943418d6a2] yield resources [ 2101.110537] env[68906]: ERROR nova.compute.manager [instance: 922d81ba-c8d2-43ba-b1c5-f2943418d6a2] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 2101.110537] env[68906]: ERROR nova.compute.manager [instance: 922d81ba-c8d2-43ba-b1c5-f2943418d6a2] self.driver.spawn(context, instance, image_meta, [ 2101.110537] env[68906]: ERROR nova.compute.manager [instance: 922d81ba-c8d2-43ba-b1c5-f2943418d6a2] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2101.110537] env[68906]: ERROR nova.compute.manager [instance: 922d81ba-c8d2-43ba-b1c5-f2943418d6a2] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2101.110537] env[68906]: ERROR nova.compute.manager [instance: 922d81ba-c8d2-43ba-b1c5-f2943418d6a2] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2101.110537] env[68906]: ERROR nova.compute.manager [instance: 922d81ba-c8d2-43ba-b1c5-f2943418d6a2] self._fetch_image_if_missing(context, vi) [ 2101.110537] env[68906]: ERROR nova.compute.manager [instance: 922d81ba-c8d2-43ba-b1c5-f2943418d6a2] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2101.110941] env[68906]: ERROR nova.compute.manager [instance: 922d81ba-c8d2-43ba-b1c5-f2943418d6a2] image_cache(vi, tmp_image_ds_loc) [ 2101.110941] env[68906]: ERROR nova.compute.manager [instance: 922d81ba-c8d2-43ba-b1c5-f2943418d6a2] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2101.110941] env[68906]: ERROR nova.compute.manager [instance: 922d81ba-c8d2-43ba-b1c5-f2943418d6a2] vm_util.copy_virtual_disk( [ 2101.110941] env[68906]: ERROR nova.compute.manager [instance: 922d81ba-c8d2-43ba-b1c5-f2943418d6a2] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2101.110941] env[68906]: ERROR nova.compute.manager [instance: 922d81ba-c8d2-43ba-b1c5-f2943418d6a2] session._wait_for_task(vmdk_copy_task) [ 2101.110941] env[68906]: ERROR nova.compute.manager [instance: 922d81ba-c8d2-43ba-b1c5-f2943418d6a2] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", 
line 157, in _wait_for_task [ 2101.110941] env[68906]: ERROR nova.compute.manager [instance: 922d81ba-c8d2-43ba-b1c5-f2943418d6a2] return self.wait_for_task(task_ref) [ 2101.110941] env[68906]: ERROR nova.compute.manager [instance: 922d81ba-c8d2-43ba-b1c5-f2943418d6a2] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2101.110941] env[68906]: ERROR nova.compute.manager [instance: 922d81ba-c8d2-43ba-b1c5-f2943418d6a2] return evt.wait() [ 2101.110941] env[68906]: ERROR nova.compute.manager [instance: 922d81ba-c8d2-43ba-b1c5-f2943418d6a2] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2101.110941] env[68906]: ERROR nova.compute.manager [instance: 922d81ba-c8d2-43ba-b1c5-f2943418d6a2] result = hub.switch() [ 2101.110941] env[68906]: ERROR nova.compute.manager [instance: 922d81ba-c8d2-43ba-b1c5-f2943418d6a2] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2101.110941] env[68906]: ERROR nova.compute.manager [instance: 922d81ba-c8d2-43ba-b1c5-f2943418d6a2] return self.greenlet.switch() [ 2101.111312] env[68906]: ERROR nova.compute.manager [instance: 922d81ba-c8d2-43ba-b1c5-f2943418d6a2] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2101.111312] env[68906]: ERROR nova.compute.manager [instance: 922d81ba-c8d2-43ba-b1c5-f2943418d6a2] self.f(*self.args, **self.kw) [ 2101.111312] env[68906]: ERROR nova.compute.manager [instance: 922d81ba-c8d2-43ba-b1c5-f2943418d6a2] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2101.111312] env[68906]: ERROR nova.compute.manager [instance: 922d81ba-c8d2-43ba-b1c5-f2943418d6a2] raise exceptions.translate_fault(task_info.error) [ 2101.111312] env[68906]: ERROR nova.compute.manager [instance: 922d81ba-c8d2-43ba-b1c5-f2943418d6a2] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2101.111312] env[68906]: ERROR nova.compute.manager [instance: 922d81ba-c8d2-43ba-b1c5-f2943418d6a2] Faults: ['InvalidArgument'] [ 2101.111312] env[68906]: ERROR nova.compute.manager [instance: 922d81ba-c8d2-43ba-b1c5-f2943418d6a2] [ 2101.111312] env[68906]: INFO nova.compute.manager [None req-90e8e710-720b-4aa7-ba38-0b17c5edf061 tempest-AttachVolumeTestJSON-1667500444 tempest-AttachVolumeTestJSON-1667500444-project-member] [instance: 922d81ba-c8d2-43ba-b1c5-f2943418d6a2] Terminating instance [ 2101.112348] env[68906]: DEBUG oslo_concurrency.lockutils [None req-5a9e0d39-3b4d-4e01-a665-7603e8d12e3f tempest-ServersNegativeTestMultiTenantJSON-1592777577 tempest-ServersNegativeTestMultiTenantJSON-1592777577-project-member] Acquired lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e/b1400c31-d33b-4e13-944f-4c645e62493e.vmdk" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2101.112553] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-5a9e0d39-3b4d-4e01-a665-7603e8d12e3f tempest-ServersNegativeTestMultiTenantJSON-1592777577 tempest-ServersNegativeTestMultiTenantJSON-1592777577-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=68906) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2101.112788] env[68906]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-0c6729c0-6df8-4f23-83ff-9d87fa91ead0 
{{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2101.114970] env[68906]: DEBUG nova.compute.manager [None req-90e8e710-720b-4aa7-ba38-0b17c5edf061 tempest-AttachVolumeTestJSON-1667500444 tempest-AttachVolumeTestJSON-1667500444-project-member] [instance: 922d81ba-c8d2-43ba-b1c5-f2943418d6a2] Start destroying the instance on the hypervisor. {{(pid=68906) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2101.115177] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-90e8e710-720b-4aa7-ba38-0b17c5edf061 tempest-AttachVolumeTestJSON-1667500444 tempest-AttachVolumeTestJSON-1667500444-project-member] [instance: 922d81ba-c8d2-43ba-b1c5-f2943418d6a2] Destroying instance {{(pid=68906) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2101.115875] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c26788f1-b6cf-4b6f-9027-9fefc4f2f06b {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2101.122762] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-90e8e710-720b-4aa7-ba38-0b17c5edf061 tempest-AttachVolumeTestJSON-1667500444 tempest-AttachVolumeTestJSON-1667500444-project-member] [instance: 922d81ba-c8d2-43ba-b1c5-f2943418d6a2] Unregistering the VM {{(pid=68906) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 2101.122974] env[68906]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-2c1d6c63-051f-4e55-bcea-4ff1cd8148c5 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2101.125094] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-5a9e0d39-3b4d-4e01-a665-7603e8d12e3f tempest-ServersNegativeTestMultiTenantJSON-1592777577 tempest-ServersNegativeTestMultiTenantJSON-1592777577-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=68906) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2101.125266] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-5a9e0d39-3b4d-4e01-a665-7603e8d12e3f tempest-ServersNegativeTestMultiTenantJSON-1592777577 tempest-ServersNegativeTestMultiTenantJSON-1592777577-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=68906) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 2101.126192] env[68906]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-bbfc32ce-53b7-4e0f-9e14-88b11a7b60bf {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2101.130973] env[68906]: DEBUG oslo_vmware.api [None req-5a9e0d39-3b4d-4e01-a665-7603e8d12e3f tempest-ServersNegativeTestMultiTenantJSON-1592777577 tempest-ServersNegativeTestMultiTenantJSON-1592777577-project-member] Waiting for the task: (returnval){ [ 2101.130973] env[68906]: value = "session[52a3cb0d-1212-490c-2669-91043b4da4d8]524f0213-ecf7-509c-daf3-9a04598f1670" [ 2101.130973] env[68906]: _type = "Task" [ 2101.130973] env[68906]: } to complete. 
{{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2101.140414] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2101.140564] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=68906) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 2101.145481] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-5a9e0d39-3b4d-4e01-a665-7603e8d12e3f tempest-ServersNegativeTestMultiTenantJSON-1592777577 tempest-ServersNegativeTestMultiTenantJSON-1592777577-project-member] [instance: 736db39c-e5e5-4a54-b85a-aa5c703f432e] Preparing fetch location {{(pid=68906) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 2101.145709] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-5a9e0d39-3b4d-4e01-a665-7603e8d12e3f tempest-ServersNegativeTestMultiTenantJSON-1592777577 tempest-ServersNegativeTestMultiTenantJSON-1592777577-project-member] Creating directory with path [datastore2] vmware_temp/24af176a-dd32-4244-a437-c10f0bcc4e08/b1400c31-d33b-4e13-944f-4c645e62493e {{(pid=68906) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2101.145910] env[68906]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-f471a21e-9a2c-4580-aa27-0a580f331cf4 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2101.166473] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-5a9e0d39-3b4d-4e01-a665-7603e8d12e3f tempest-ServersNegativeTestMultiTenantJSON-1592777577 tempest-ServersNegativeTestMultiTenantJSON-1592777577-project-member] Created directory with path [datastore2] vmware_temp/24af176a-dd32-4244-a437-c10f0bcc4e08/b1400c31-d33b-4e13-944f-4c645e62493e {{(pid=68906) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2101.166676] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-5a9e0d39-3b4d-4e01-a665-7603e8d12e3f tempest-ServersNegativeTestMultiTenantJSON-1592777577 tempest-ServersNegativeTestMultiTenantJSON-1592777577-project-member] [instance: 736db39c-e5e5-4a54-b85a-aa5c703f432e] Fetch image to [datastore2] vmware_temp/24af176a-dd32-4244-a437-c10f0bcc4e08/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk {{(pid=68906) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 2101.166851] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-5a9e0d39-3b4d-4e01-a665-7603e8d12e3f tempest-ServersNegativeTestMultiTenantJSON-1592777577 tempest-ServersNegativeTestMultiTenantJSON-1592777577-project-member] [instance: 736db39c-e5e5-4a54-b85a-aa5c703f432e] Downloading image file data b1400c31-d33b-4e13-944f-4c645e62493e to [datastore2] vmware_temp/24af176a-dd32-4244-a437-c10f0bcc4e08/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk on the data store datastore2 {{(pid=68906) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 2101.167659] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8cd109b6-25d0-4ba1-8fc1-baa653e739ea {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2101.174615] env[68906]: DEBUG 
oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c9e75be6-c6ba-4bc7-a69f-889c752f9d59 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2101.183745] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9bb0d644-510a-4d3c-aa0d-d58338499b9a {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2101.215805] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4b30ef98-61bb-41e2-b00a-d6949c8e5500 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2101.218209] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-90e8e710-720b-4aa7-ba38-0b17c5edf061 tempest-AttachVolumeTestJSON-1667500444 tempest-AttachVolumeTestJSON-1667500444-project-member] [instance: 922d81ba-c8d2-43ba-b1c5-f2943418d6a2] Unregistered the VM {{(pid=68906) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 2101.218402] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-90e8e710-720b-4aa7-ba38-0b17c5edf061 tempest-AttachVolumeTestJSON-1667500444 tempest-AttachVolumeTestJSON-1667500444-project-member] [instance: 922d81ba-c8d2-43ba-b1c5-f2943418d6a2] Deleting contents of the VM from datastore datastore2 {{(pid=68906) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 2101.218600] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-90e8e710-720b-4aa7-ba38-0b17c5edf061 tempest-AttachVolumeTestJSON-1667500444 tempest-AttachVolumeTestJSON-1667500444-project-member] Deleting the datastore file [datastore2] 922d81ba-c8d2-43ba-b1c5-f2943418d6a2 {{(pid=68906) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 2101.218818] env[68906]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-26e40eb9-b92f-4695-9fcf-7a71ed2744cf {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2101.223538] env[68906]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-1ca48b18-ae98-4ad1-b9a3-2086fdc13da5 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2101.226262] env[68906]: DEBUG oslo_vmware.api [None req-90e8e710-720b-4aa7-ba38-0b17c5edf061 tempest-AttachVolumeTestJSON-1667500444 tempest-AttachVolumeTestJSON-1667500444-project-member] Waiting for the task: (returnval){ [ 2101.226262] env[68906]: value = "task-3475464" [ 2101.226262] env[68906]: _type = "Task" [ 2101.226262] env[68906]: } to complete. {{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2101.234359] env[68906]: DEBUG oslo_vmware.api [None req-90e8e710-720b-4aa7-ba38-0b17c5edf061 tempest-AttachVolumeTestJSON-1667500444 tempest-AttachVolumeTestJSON-1667500444-project-member] Task: {'id': task-3475464, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2101.244664] env[68906]: DEBUG nova.virt.vmwareapi.images [None req-5a9e0d39-3b4d-4e01-a665-7603e8d12e3f tempest-ServersNegativeTestMultiTenantJSON-1592777577 tempest-ServersNegativeTestMultiTenantJSON-1592777577-project-member] [instance: 736db39c-e5e5-4a54-b85a-aa5c703f432e] Downloading image file data b1400c31-d33b-4e13-944f-4c645e62493e to the data store datastore2 {{(pid=68906) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 2101.362176] env[68906]: DEBUG oslo_vmware.rw_handles [None req-5a9e0d39-3b4d-4e01-a665-7603e8d12e3f tempest-ServersNegativeTestMultiTenantJSON-1592777577 tempest-ServersNegativeTestMultiTenantJSON-1592777577-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/24af176a-dd32-4244-a437-c10f0bcc4e08/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=68906) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 2101.421216] env[68906]: DEBUG oslo_vmware.rw_handles [None req-5a9e0d39-3b4d-4e01-a665-7603e8d12e3f tempest-ServersNegativeTestMultiTenantJSON-1592777577 tempest-ServersNegativeTestMultiTenantJSON-1592777577-project-member] Completed reading data from the image iterator. {{(pid=68906) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 2101.421450] env[68906]: DEBUG oslo_vmware.rw_handles [None req-5a9e0d39-3b4d-4e01-a665-7603e8d12e3f tempest-ServersNegativeTestMultiTenantJSON-1592777577 tempest-ServersNegativeTestMultiTenantJSON-1592777577-project-member] Closing write handle for https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/24af176a-dd32-4244-a437-c10f0bcc4e08/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=68906) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 2101.736449] env[68906]: DEBUG oslo_vmware.api [None req-90e8e710-720b-4aa7-ba38-0b17c5edf061 tempest-AttachVolumeTestJSON-1667500444 tempest-AttachVolumeTestJSON-1667500444-project-member] Task: {'id': task-3475464, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.069408} completed successfully. 
{{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2101.736794] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-90e8e710-720b-4aa7-ba38-0b17c5edf061 tempest-AttachVolumeTestJSON-1667500444 tempest-AttachVolumeTestJSON-1667500444-project-member] Deleted the datastore file {{(pid=68906) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 2101.736868] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-90e8e710-720b-4aa7-ba38-0b17c5edf061 tempest-AttachVolumeTestJSON-1667500444 tempest-AttachVolumeTestJSON-1667500444-project-member] [instance: 922d81ba-c8d2-43ba-b1c5-f2943418d6a2] Deleted contents of the VM from datastore datastore2 {{(pid=68906) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 2101.737022] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-90e8e710-720b-4aa7-ba38-0b17c5edf061 tempest-AttachVolumeTestJSON-1667500444 tempest-AttachVolumeTestJSON-1667500444-project-member] [instance: 922d81ba-c8d2-43ba-b1c5-f2943418d6a2] Instance destroyed {{(pid=68906) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2101.737197] env[68906]: INFO nova.compute.manager [None req-90e8e710-720b-4aa7-ba38-0b17c5edf061 tempest-AttachVolumeTestJSON-1667500444 tempest-AttachVolumeTestJSON-1667500444-project-member] [instance: 922d81ba-c8d2-43ba-b1c5-f2943418d6a2] Took 0.62 seconds to destroy the instance on the hypervisor. [ 2101.739358] env[68906]: DEBUG nova.compute.claims [None req-90e8e710-720b-4aa7-ba38-0b17c5edf061 tempest-AttachVolumeTestJSON-1667500444 tempest-AttachVolumeTestJSON-1667500444-project-member] [instance: 922d81ba-c8d2-43ba-b1c5-f2943418d6a2] Aborting claim: {{(pid=68906) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 2101.739528] env[68906]: DEBUG oslo_concurrency.lockutils [None req-90e8e710-720b-4aa7-ba38-0b17c5edf061 tempest-AttachVolumeTestJSON-1667500444 tempest-AttachVolumeTestJSON-1667500444-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2101.739753] env[68906]: DEBUG oslo_concurrency.lockutils [None req-90e8e710-720b-4aa7-ba38-0b17c5edf061 tempest-AttachVolumeTestJSON-1667500444 tempest-AttachVolumeTestJSON-1667500444-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2101.851087] env[68906]: DEBUG nova.scheduler.client.report [None req-90e8e710-720b-4aa7-ba38-0b17c5edf061 tempest-AttachVolumeTestJSON-1667500444 tempest-AttachVolumeTestJSON-1667500444-project-member] Refreshing inventories for resource provider 1119f6db-bfd7-4ef3-bdff-5c6974dc249b {{(pid=68906) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:804}} [ 2101.864319] env[68906]: DEBUG nova.scheduler.client.report [None req-90e8e710-720b-4aa7-ba38-0b17c5edf061 tempest-AttachVolumeTestJSON-1667500444 tempest-AttachVolumeTestJSON-1667500444-project-member] Updating ProviderTree inventory for provider 1119f6db-bfd7-4ef3-bdff-5c6974dc249b from _refresh_and_get_inventory using data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 
'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 93, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68906) _refresh_and_get_inventory /opt/stack/nova/nova/scheduler/client/report.py:768}} [ 2101.864539] env[68906]: DEBUG nova.compute.provider_tree [None req-90e8e710-720b-4aa7-ba38-0b17c5edf061 tempest-AttachVolumeTestJSON-1667500444 tempest-AttachVolumeTestJSON-1667500444-project-member] Updating inventory in ProviderTree for provider 1119f6db-bfd7-4ef3-bdff-5c6974dc249b with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 93, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68906) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 2101.874817] env[68906]: DEBUG nova.scheduler.client.report [None req-90e8e710-720b-4aa7-ba38-0b17c5edf061 tempest-AttachVolumeTestJSON-1667500444 tempest-AttachVolumeTestJSON-1667500444-project-member] Refreshing aggregate associations for resource provider 1119f6db-bfd7-4ef3-bdff-5c6974dc249b, aggregates: None {{(pid=68906) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:813}} [ 2101.892069] env[68906]: DEBUG nova.scheduler.client.report [None req-90e8e710-720b-4aa7-ba38-0b17c5edf061 tempest-AttachVolumeTestJSON-1667500444 tempest-AttachVolumeTestJSON-1667500444-project-member] Refreshing trait associations for resource provider 1119f6db-bfd7-4ef3-bdff-5c6974dc249b, traits: COMPUTE_SAME_HOST_COLD_MIGRATE,COMPUTE_IMAGE_TYPE_VMDK,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NODE,COMPUTE_IMAGE_TYPE_ISO {{(pid=68906) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:825}} [ 2102.031802] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-970034b7-9f7c-45f8-ae18-d1732182e265 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2102.039529] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-20f58883-c537-4715-aeca-7557b75ab4ee {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2102.069178] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-93fe229d-d3d2-4ea9-886b-181ca09b98ef {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2102.076306] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2bd2f9ac-8eca-42e6-b0a0-32f4fb484fef {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2102.089851] env[68906]: DEBUG nova.compute.provider_tree [None req-90e8e710-720b-4aa7-ba38-0b17c5edf061 tempest-AttachVolumeTestJSON-1667500444 tempest-AttachVolumeTestJSON-1667500444-project-member] Inventory has not changed in ProviderTree for provider: 1119f6db-bfd7-4ef3-bdff-5c6974dc249b {{(pid=68906) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2102.098081] env[68906]: DEBUG nova.scheduler.client.report [None req-90e8e710-720b-4aa7-ba38-0b17c5edf061 tempest-AttachVolumeTestJSON-1667500444 
tempest-AttachVolumeTestJSON-1667500444-project-member] Inventory has not changed for provider 1119f6db-bfd7-4ef3-bdff-5c6974dc249b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 93, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68906) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2102.111637] env[68906]: DEBUG oslo_concurrency.lockutils [None req-90e8e710-720b-4aa7-ba38-0b17c5edf061 tempest-AttachVolumeTestJSON-1667500444 tempest-AttachVolumeTestJSON-1667500444-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.372s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2102.112167] env[68906]: ERROR nova.compute.manager [None req-90e8e710-720b-4aa7-ba38-0b17c5edf061 tempest-AttachVolumeTestJSON-1667500444 tempest-AttachVolumeTestJSON-1667500444-project-member] [instance: 922d81ba-c8d2-43ba-b1c5-f2943418d6a2] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2102.112167] env[68906]: Faults: ['InvalidArgument'] [ 2102.112167] env[68906]: ERROR nova.compute.manager [instance: 922d81ba-c8d2-43ba-b1c5-f2943418d6a2] Traceback (most recent call last): [ 2102.112167] env[68906]: ERROR nova.compute.manager [instance: 922d81ba-c8d2-43ba-b1c5-f2943418d6a2] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 2102.112167] env[68906]: ERROR nova.compute.manager [instance: 922d81ba-c8d2-43ba-b1c5-f2943418d6a2] self.driver.spawn(context, instance, image_meta, [ 2102.112167] env[68906]: ERROR nova.compute.manager [instance: 922d81ba-c8d2-43ba-b1c5-f2943418d6a2] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2102.112167] env[68906]: ERROR nova.compute.manager [instance: 922d81ba-c8d2-43ba-b1c5-f2943418d6a2] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2102.112167] env[68906]: ERROR nova.compute.manager [instance: 922d81ba-c8d2-43ba-b1c5-f2943418d6a2] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2102.112167] env[68906]: ERROR nova.compute.manager [instance: 922d81ba-c8d2-43ba-b1c5-f2943418d6a2] self._fetch_image_if_missing(context, vi) [ 2102.112167] env[68906]: ERROR nova.compute.manager [instance: 922d81ba-c8d2-43ba-b1c5-f2943418d6a2] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2102.112167] env[68906]: ERROR nova.compute.manager [instance: 922d81ba-c8d2-43ba-b1c5-f2943418d6a2] image_cache(vi, tmp_image_ds_loc) [ 2102.112167] env[68906]: ERROR nova.compute.manager [instance: 922d81ba-c8d2-43ba-b1c5-f2943418d6a2] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2102.112447] env[68906]: ERROR nova.compute.manager [instance: 922d81ba-c8d2-43ba-b1c5-f2943418d6a2] vm_util.copy_virtual_disk( [ 2102.112447] env[68906]: ERROR nova.compute.manager [instance: 922d81ba-c8d2-43ba-b1c5-f2943418d6a2] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2102.112447] env[68906]: ERROR nova.compute.manager [instance: 922d81ba-c8d2-43ba-b1c5-f2943418d6a2] 
session._wait_for_task(vmdk_copy_task) [ 2102.112447] env[68906]: ERROR nova.compute.manager [instance: 922d81ba-c8d2-43ba-b1c5-f2943418d6a2] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2102.112447] env[68906]: ERROR nova.compute.manager [instance: 922d81ba-c8d2-43ba-b1c5-f2943418d6a2] return self.wait_for_task(task_ref) [ 2102.112447] env[68906]: ERROR nova.compute.manager [instance: 922d81ba-c8d2-43ba-b1c5-f2943418d6a2] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2102.112447] env[68906]: ERROR nova.compute.manager [instance: 922d81ba-c8d2-43ba-b1c5-f2943418d6a2] return evt.wait() [ 2102.112447] env[68906]: ERROR nova.compute.manager [instance: 922d81ba-c8d2-43ba-b1c5-f2943418d6a2] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2102.112447] env[68906]: ERROR nova.compute.manager [instance: 922d81ba-c8d2-43ba-b1c5-f2943418d6a2] result = hub.switch() [ 2102.112447] env[68906]: ERROR nova.compute.manager [instance: 922d81ba-c8d2-43ba-b1c5-f2943418d6a2] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2102.112447] env[68906]: ERROR nova.compute.manager [instance: 922d81ba-c8d2-43ba-b1c5-f2943418d6a2] return self.greenlet.switch() [ 2102.112447] env[68906]: ERROR nova.compute.manager [instance: 922d81ba-c8d2-43ba-b1c5-f2943418d6a2] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2102.112447] env[68906]: ERROR nova.compute.manager [instance: 922d81ba-c8d2-43ba-b1c5-f2943418d6a2] self.f(*self.args, **self.kw) [ 2102.112709] env[68906]: ERROR nova.compute.manager [instance: 922d81ba-c8d2-43ba-b1c5-f2943418d6a2] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2102.112709] env[68906]: ERROR nova.compute.manager [instance: 922d81ba-c8d2-43ba-b1c5-f2943418d6a2] raise exceptions.translate_fault(task_info.error) [ 2102.112709] env[68906]: ERROR nova.compute.manager [instance: 922d81ba-c8d2-43ba-b1c5-f2943418d6a2] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2102.112709] env[68906]: ERROR nova.compute.manager [instance: 922d81ba-c8d2-43ba-b1c5-f2943418d6a2] Faults: ['InvalidArgument'] [ 2102.112709] env[68906]: ERROR nova.compute.manager [instance: 922d81ba-c8d2-43ba-b1c5-f2943418d6a2] [ 2102.113227] env[68906]: DEBUG nova.compute.utils [None req-90e8e710-720b-4aa7-ba38-0b17c5edf061 tempest-AttachVolumeTestJSON-1667500444 tempest-AttachVolumeTestJSON-1667500444-project-member] [instance: 922d81ba-c8d2-43ba-b1c5-f2943418d6a2] VimFaultException {{(pid=68906) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 2102.114201] env[68906]: DEBUG nova.compute.manager [None req-90e8e710-720b-4aa7-ba38-0b17c5edf061 tempest-AttachVolumeTestJSON-1667500444 tempest-AttachVolumeTestJSON-1667500444-project-member] [instance: 922d81ba-c8d2-43ba-b1c5-f2943418d6a2] Build of instance 922d81ba-c8d2-43ba-b1c5-f2943418d6a2 was re-scheduled: A specified parameter was not correct: fileType [ 2102.114201] env[68906]: Faults: ['InvalidArgument'] {{(pid=68906) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 2102.114582] env[68906]: DEBUG nova.compute.manager [None req-90e8e710-720b-4aa7-ba38-0b17c5edf061 tempest-AttachVolumeTestJSON-1667500444 tempest-AttachVolumeTestJSON-1667500444-project-member] [instance: 
922d81ba-c8d2-43ba-b1c5-f2943418d6a2] Unplugging VIFs for instance {{(pid=68906) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 2102.114757] env[68906]: DEBUG nova.compute.manager [None req-90e8e710-720b-4aa7-ba38-0b17c5edf061 tempest-AttachVolumeTestJSON-1667500444 tempest-AttachVolumeTestJSON-1667500444-project-member] Virt driver does not provide unplug_vifs method, so it is not possible to determine if VIFs should be unplugged. {{(pid=68906) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 2102.114926] env[68906]: DEBUG nova.compute.manager [None req-90e8e710-720b-4aa7-ba38-0b17c5edf061 tempest-AttachVolumeTestJSON-1667500444 tempest-AttachVolumeTestJSON-1667500444-project-member] [instance: 922d81ba-c8d2-43ba-b1c5-f2943418d6a2] Deallocating network for instance {{(pid=68906) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 2102.115111] env[68906]: DEBUG nova.network.neutron [None req-90e8e710-720b-4aa7-ba38-0b17c5edf061 tempest-AttachVolumeTestJSON-1667500444 tempest-AttachVolumeTestJSON-1667500444-project-member] [instance: 922d81ba-c8d2-43ba-b1c5-f2943418d6a2] deallocate_for_instance() {{(pid=68906) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2102.413971] env[68906]: DEBUG nova.network.neutron [None req-90e8e710-720b-4aa7-ba38-0b17c5edf061 tempest-AttachVolumeTestJSON-1667500444 tempest-AttachVolumeTestJSON-1667500444-project-member] [instance: 922d81ba-c8d2-43ba-b1c5-f2943418d6a2] Updating instance_info_cache with network_info: [] {{(pid=68906) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2102.425115] env[68906]: INFO nova.compute.manager [None req-90e8e710-720b-4aa7-ba38-0b17c5edf061 tempest-AttachVolumeTestJSON-1667500444 tempest-AttachVolumeTestJSON-1667500444-project-member] [instance: 922d81ba-c8d2-43ba-b1c5-f2943418d6a2] Took 0.31 seconds to deallocate network for instance.
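The traceback above fails inside the task poller: copy_virtual_disk hands the CopyVirtualDisk_Task back to _wait_for_task, which blocks until vCenter reports the task finished and re-raises a failed task's fault, here the InvalidArgument "fileType" error, as a VimFaultException. A minimal sketch of that poll-and-translate pattern, assuming a get_task_info callable that returns an object with state and error attributes (the names below are illustrative, not the oslo.vmware source):

import time

def poll_vmware_task(get_task_info, task_ref, interval=0.5):
    """Poll a vCenter task until it finishes; raise if the task failed."""
    while True:
        info = get_task_info(task_ref)   # TaskInfo-like: .state, .error
        if info.state == 'success':
            return getattr(info, 'result', None)
        if info.state == 'error':
            # This is the point where "A specified parameter was not
            # correct: fileType" surfaces; oslo.vmware wraps the fault in
            # VimFaultException rather than the RuntimeError used here.
            raise RuntimeError(info.error.localizedMessage)
        time.sleep(interval)             # task still queued or running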
[ 2102.527301] env[68906]: INFO nova.scheduler.client.report [None req-90e8e710-720b-4aa7-ba38-0b17c5edf061 tempest-AttachVolumeTestJSON-1667500444 tempest-AttachVolumeTestJSON-1667500444-project-member] Deleted allocations for instance 922d81ba-c8d2-43ba-b1c5-f2943418d6a2 [ 2102.550883] env[68906]: DEBUG oslo_concurrency.lockutils [None req-90e8e710-720b-4aa7-ba38-0b17c5edf061 tempest-AttachVolumeTestJSON-1667500444 tempest-AttachVolumeTestJSON-1667500444-project-member] Lock "922d81ba-c8d2-43ba-b1c5-f2943418d6a2" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 650.641s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2102.551970] env[68906]: DEBUG oslo_concurrency.lockutils [None req-23bcf793-7882-41c0-a6ef-da7b87706b50 tempest-AttachVolumeTestJSON-1667500444 tempest-AttachVolumeTestJSON-1667500444-project-member] Lock "922d81ba-c8d2-43ba-b1c5-f2943418d6a2" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 454.594s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2102.552200] env[68906]: DEBUG oslo_concurrency.lockutils [None req-23bcf793-7882-41c0-a6ef-da7b87706b50 tempest-AttachVolumeTestJSON-1667500444 tempest-AttachVolumeTestJSON-1667500444-project-member] Acquiring lock "922d81ba-c8d2-43ba-b1c5-f2943418d6a2-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2102.552502] env[68906]: DEBUG oslo_concurrency.lockutils [None req-23bcf793-7882-41c0-a6ef-da7b87706b50 tempest-AttachVolumeTestJSON-1667500444 tempest-AttachVolumeTestJSON-1667500444-project-member] Lock "922d81ba-c8d2-43ba-b1c5-f2943418d6a2-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2102.552601] env[68906]: DEBUG oslo_concurrency.lockutils [None req-23bcf793-7882-41c0-a6ef-da7b87706b50 tempest-AttachVolumeTestJSON-1667500444 tempest-AttachVolumeTestJSON-1667500444-project-member] Lock "922d81ba-c8d2-43ba-b1c5-f2943418d6a2-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2102.555046] env[68906]: INFO nova.compute.manager [None req-23bcf793-7882-41c0-a6ef-da7b87706b50 tempest-AttachVolumeTestJSON-1667500444 tempest-AttachVolumeTestJSON-1667500444-project-member] [instance: 922d81ba-c8d2-43ba-b1c5-f2943418d6a2] Terminating instance [ 2102.556467] env[68906]: DEBUG nova.compute.manager [None req-23bcf793-7882-41c0-a6ef-da7b87706b50 tempest-AttachVolumeTestJSON-1667500444 tempest-AttachVolumeTestJSON-1667500444-project-member] [instance: 922d81ba-c8d2-43ba-b1c5-f2943418d6a2] Start destroying the instance on the hypervisor.
{{(pid=68906) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2102.556817] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-23bcf793-7882-41c0-a6ef-da7b87706b50 tempest-AttachVolumeTestJSON-1667500444 tempest-AttachVolumeTestJSON-1667500444-project-member] [instance: 922d81ba-c8d2-43ba-b1c5-f2943418d6a2] Destroying instance {{(pid=68906) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2102.557165] env[68906]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-2d8c1387-fbfb-4190-80a7-805c03ffbcc6 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2102.567188] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d63d157d-0172-4061-914d-24e77d26d72a {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2102.578371] env[68906]: DEBUG nova.compute.manager [None req-8466e239-f128-4599-96da-bff01e78a993 tempest-ServerDiskConfigTestJSON-1909384467 tempest-ServerDiskConfigTestJSON-1909384467-project-member] [instance: cd208e67-55a3-4c0b-ad49-abd3a700d5ef] Starting instance... {{(pid=68906) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 2102.599515] env[68906]: WARNING nova.virt.vmwareapi.vmops [None req-23bcf793-7882-41c0-a6ef-da7b87706b50 tempest-AttachVolumeTestJSON-1667500444 tempest-AttachVolumeTestJSON-1667500444-project-member] [instance: 922d81ba-c8d2-43ba-b1c5-f2943418d6a2] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 922d81ba-c8d2-43ba-b1c5-f2943418d6a2 could not be found. [ 2102.599718] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-23bcf793-7882-41c0-a6ef-da7b87706b50 tempest-AttachVolumeTestJSON-1667500444 tempest-AttachVolumeTestJSON-1667500444-project-member] [instance: 922d81ba-c8d2-43ba-b1c5-f2943418d6a2] Instance destroyed {{(pid=68906) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2102.600056] env[68906]: INFO nova.compute.manager [None req-23bcf793-7882-41c0-a6ef-da7b87706b50 tempest-AttachVolumeTestJSON-1667500444 tempest-AttachVolumeTestJSON-1667500444-project-member] [instance: 922d81ba-c8d2-43ba-b1c5-f2943418d6a2] Took 0.04 seconds to destroy the instance on the hypervisor. [ 2102.600300] env[68906]: DEBUG oslo.service.loopingcall [None req-23bcf793-7882-41c0-a6ef-da7b87706b50 tempest-AttachVolumeTestJSON-1667500444 tempest-AttachVolumeTestJSON-1667500444-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return.
{{(pid=68906) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2102.600555] env[68906]: DEBUG nova.compute.manager [-] [instance: 922d81ba-c8d2-43ba-b1c5-f2943418d6a2] Deallocating network for instance {{(pid=68906) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 2102.600684] env[68906]: DEBUG nova.network.neutron [-] [instance: 922d81ba-c8d2-43ba-b1c5-f2943418d6a2] deallocate_for_instance() {{(pid=68906) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2102.625297] env[68906]: DEBUG nova.network.neutron [-] [instance: 922d81ba-c8d2-43ba-b1c5-f2943418d6a2] Updating instance_info_cache with network_info: [] {{(pid=68906) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2102.636023] env[68906]: INFO nova.compute.manager [-] [instance: 922d81ba-c8d2-43ba-b1c5-f2943418d6a2] Took 0.03 seconds to deallocate network for instance. [ 2102.636023] env[68906]: DEBUG oslo_concurrency.lockutils [None req-8466e239-f128-4599-96da-bff01e78a993 tempest-ServerDiskConfigTestJSON-1909384467 tempest-ServerDiskConfigTestJSON-1909384467-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2102.636023] env[68906]: DEBUG oslo_concurrency.lockutils [None req-8466e239-f128-4599-96da-bff01e78a993 tempest-ServerDiskConfigTestJSON-1909384467 tempest-ServerDiskConfigTestJSON-1909384467-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2102.637214] env[68906]: INFO nova.compute.claims [None req-8466e239-f128-4599-96da-bff01e78a993 tempest-ServerDiskConfigTestJSON-1909384467 tempest-ServerDiskConfigTestJSON-1909384467-project-member] [instance: cd208e67-55a3-4c0b-ad49-abd3a700d5ef] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 2102.739994] env[68906]: DEBUG oslo_concurrency.lockutils [None req-23bcf793-7882-41c0-a6ef-da7b87706b50 tempest-AttachVolumeTestJSON-1667500444 tempest-AttachVolumeTestJSON-1667500444-project-member] Lock "922d81ba-c8d2-43ba-b1c5-f2943418d6a2" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.187s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2102.740296] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Lock "922d81ba-c8d2-43ba-b1c5-f2943418d6a2" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 279.569s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2102.740437] env[68906]: INFO nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: 922d81ba-c8d2-43ba-b1c5-f2943418d6a2] During sync_power_state the instance has a pending task (deleting). Skip.
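Each Acquiring/acquired/released triple above, with its waited and held timings, is emitted by oslo.concurrency's named-lock decorator. A rough threading-only sketch of that instrumentation (the real lockutils also supports fair locks and external inter-process file locks; everything below is a simplified assumption):

import threading
import time

_LOCKS = {}  # one shared lock object per lock name

def synchronized(name):
    """Sketch of a named-lock decorator that logs waited/held durations."""
    lock = _LOCKS.setdefault(name, threading.Lock())
    def decorator(fn):
        def inner(*args, **kwargs):
            t_wait = time.monotonic()
            with lock:
                print(f'Lock "{name}" acquired by "{fn.__qualname__}" '
                      f':: waited {time.monotonic() - t_wait:.3f}s')
                t_held = time.monotonic()
                try:
                    return fn(*args, **kwargs)
                finally:
                    print(f'Lock "{name}" "released" by "{fn.__qualname__}" '
                          f':: held {time.monotonic() - t_held:.3f}s')
        return inner
    return decorator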
[ 2102.740609] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Lock "922d81ba-c8d2-43ba-b1c5-f2943418d6a2" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.001s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2102.825122] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b7008397-8b0a-4a37-8238-66151fe13d61 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2102.832717] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cc2eb127-052d-4a7f-a174-85b930db803c {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2102.862060] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a63c0ddd-fccf-465d-803e-55c96418863f {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2102.868910] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-50a3e7fd-4973-4188-8f59-324e58bf0cd7 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2102.881485] env[68906]: DEBUG nova.compute.provider_tree [None req-8466e239-f128-4599-96da-bff01e78a993 tempest-ServerDiskConfigTestJSON-1909384467 tempest-ServerDiskConfigTestJSON-1909384467-project-member] Inventory has not changed in ProviderTree for provider: 1119f6db-bfd7-4ef3-bdff-5c6974dc249b {{(pid=68906) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2102.890445] env[68906]: DEBUG nova.scheduler.client.report [None req-8466e239-f128-4599-96da-bff01e78a993 tempest-ServerDiskConfigTestJSON-1909384467 tempest-ServerDiskConfigTestJSON-1909384467-project-member] Inventory has not changed for provider 1119f6db-bfd7-4ef3-bdff-5c6974dc249b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 93, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68906) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2102.905168] env[68906]: DEBUG oslo_concurrency.lockutils [None req-8466e239-f128-4599-96da-bff01e78a993 tempest-ServerDiskConfigTestJSON-1909384467 tempest-ServerDiskConfigTestJSON-1909384467-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.269s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2102.905648] env[68906]: DEBUG nova.compute.manager [None req-8466e239-f128-4599-96da-bff01e78a993 tempest-ServerDiskConfigTestJSON-1909384467 tempest-ServerDiskConfigTestJSON-1909384467-project-member] [instance: cd208e67-55a3-4c0b-ad49-abd3a700d5ef] Start building networks asynchronously for instance.
{{(pid=68906) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 2102.935906] env[68906]: DEBUG nova.compute.utils [None req-8466e239-f128-4599-96da-bff01e78a993 tempest-ServerDiskConfigTestJSON-1909384467 tempest-ServerDiskConfigTestJSON-1909384467-project-member] Using /dev/sd instead of None {{(pid=68906) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 2102.937224] env[68906]: DEBUG nova.compute.manager [None req-8466e239-f128-4599-96da-bff01e78a993 tempest-ServerDiskConfigTestJSON-1909384467 tempest-ServerDiskConfigTestJSON-1909384467-project-member] [instance: cd208e67-55a3-4c0b-ad49-abd3a700d5ef] Allocating IP information in the background. {{(pid=68906) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 2102.937394] env[68906]: DEBUG nova.network.neutron [None req-8466e239-f128-4599-96da-bff01e78a993 tempest-ServerDiskConfigTestJSON-1909384467 tempest-ServerDiskConfigTestJSON-1909384467-project-member] [instance: cd208e67-55a3-4c0b-ad49-abd3a700d5ef] allocate_for_instance() {{(pid=68906) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 2102.945950] env[68906]: DEBUG nova.compute.manager [None req-8466e239-f128-4599-96da-bff01e78a993 tempest-ServerDiskConfigTestJSON-1909384467 tempest-ServerDiskConfigTestJSON-1909384467-project-member] [instance: cd208e67-55a3-4c0b-ad49-abd3a700d5ef] Start building block device mappings for instance. {{(pid=68906) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 2102.994144] env[68906]: DEBUG nova.policy [None req-8466e239-f128-4599-96da-bff01e78a993 tempest-ServerDiskConfigTestJSON-1909384467 tempest-ServerDiskConfigTestJSON-1909384467-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '58bee9a44ac942a287d360f281e25f02', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '41d6ceb682de4f6088d3b84b57ae1101', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68906) authorize /opt/stack/nova/nova/policy.py:203}} [ 2103.007634] env[68906]: DEBUG nova.compute.manager [None req-8466e239-f128-4599-96da-bff01e78a993 tempest-ServerDiskConfigTestJSON-1909384467 tempest-ServerDiskConfigTestJSON-1909384467-project-member] [instance: cd208e67-55a3-4c0b-ad49-abd3a700d5ef] Start spawning the instance on the hypervisor. 
{{(pid=68906) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 2103.032092] env[68906]: DEBUG nova.virt.hardware [None req-8466e239-f128-4599-96da-bff01e78a993 tempest-ServerDiskConfigTestJSON-1909384467 tempest-ServerDiskConfigTestJSON-1909384467-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-17T13:00:39Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-17T13:00:23Z,direct_url=<?>,disk_format='vmdk',id=b1400c31-d33b-4e13-944f-4c645e62493e,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='1ae7bf3a375d41c6af5e7536af51ffd1',properties=ImageMetaProps,protected=<?>,size=21318656,status='active',tags=<?>,updated_at=2025-04-17T13:00:24Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=68906) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 2103.032345] env[68906]: DEBUG nova.virt.hardware [None req-8466e239-f128-4599-96da-bff01e78a993 tempest-ServerDiskConfigTestJSON-1909384467 tempest-ServerDiskConfigTestJSON-1909384467-project-member] Flavor limits 0:0:0 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 2103.032500] env[68906]: DEBUG nova.virt.hardware [None req-8466e239-f128-4599-96da-bff01e78a993 tempest-ServerDiskConfigTestJSON-1909384467 tempest-ServerDiskConfigTestJSON-1909384467-project-member] Image limits 0:0:0 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 2103.032679] env[68906]: DEBUG nova.virt.hardware [None req-8466e239-f128-4599-96da-bff01e78a993 tempest-ServerDiskConfigTestJSON-1909384467 tempest-ServerDiskConfigTestJSON-1909384467-project-member] Flavor pref 0:0:0 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 2103.032825] env[68906]: DEBUG nova.virt.hardware [None req-8466e239-f128-4599-96da-bff01e78a993 tempest-ServerDiskConfigTestJSON-1909384467 tempest-ServerDiskConfigTestJSON-1909384467-project-member] Image pref 0:0:0 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 2103.032968] env[68906]: DEBUG nova.virt.hardware [None req-8466e239-f128-4599-96da-bff01e78a993 tempest-ServerDiskConfigTestJSON-1909384467 tempest-ServerDiskConfigTestJSON-1909384467-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 2103.033195] env[68906]: DEBUG nova.virt.hardware [None req-8466e239-f128-4599-96da-bff01e78a993 tempest-ServerDiskConfigTestJSON-1909384467 tempest-ServerDiskConfigTestJSON-1909384467-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68906) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 2103.033357] env[68906]: DEBUG nova.virt.hardware [None req-8466e239-f128-4599-96da-bff01e78a993 tempest-ServerDiskConfigTestJSON-1909384467 tempest-ServerDiskConfigTestJSON-1909384467-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68906) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 2103.033524] 
env[68906]: DEBUG nova.virt.hardware [None req-8466e239-f128-4599-96da-bff01e78a993 tempest-ServerDiskConfigTestJSON-1909384467 tempest-ServerDiskConfigTestJSON-1909384467-project-member] Got 1 possible topologies {{(pid=68906) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 2103.033690] env[68906]: DEBUG nova.virt.hardware [None req-8466e239-f128-4599-96da-bff01e78a993 tempest-ServerDiskConfigTestJSON-1909384467 tempest-ServerDiskConfigTestJSON-1909384467-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68906) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 2103.033930] env[68906]: DEBUG nova.virt.hardware [None req-8466e239-f128-4599-96da-bff01e78a993 tempest-ServerDiskConfigTestJSON-1909384467 tempest-ServerDiskConfigTestJSON-1909384467-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68906) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 2103.034800] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f09b0955-5e3f-458e-b5b1-0d728e08f18a {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2103.042426] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b1b80bcc-af44-47e8-9256-8f926979b356 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2103.278689] env[68906]: DEBUG nova.network.neutron [None req-8466e239-f128-4599-96da-bff01e78a993 tempest-ServerDiskConfigTestJSON-1909384467 tempest-ServerDiskConfigTestJSON-1909384467-project-member] [instance: cd208e67-55a3-4c0b-ad49-abd3a700d5ef] Successfully created port: a9beb9f3-a2d0-4d9b-b31e-50ef65b8e0e4 {{(pid=68906) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 2103.998579] env[68906]: DEBUG nova.network.neutron [None req-8466e239-f128-4599-96da-bff01e78a993 tempest-ServerDiskConfigTestJSON-1909384467 tempest-ServerDiskConfigTestJSON-1909384467-project-member] [instance: cd208e67-55a3-4c0b-ad49-abd3a700d5ef] Successfully updated port: a9beb9f3-a2d0-4d9b-b31e-50ef65b8e0e4 {{(pid=68906) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 2104.010783] env[68906]: DEBUG oslo_concurrency.lockutils [None req-8466e239-f128-4599-96da-bff01e78a993 tempest-ServerDiskConfigTestJSON-1909384467 tempest-ServerDiskConfigTestJSON-1909384467-project-member] Acquiring lock "refresh_cache-cd208e67-55a3-4c0b-ad49-abd3a700d5ef" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2104.010932] env[68906]: DEBUG oslo_concurrency.lockutils [None req-8466e239-f128-4599-96da-bff01e78a993 tempest-ServerDiskConfigTestJSON-1909384467 tempest-ServerDiskConfigTestJSON-1909384467-project-member] Acquired lock "refresh_cache-cd208e67-55a3-4c0b-ad49-abd3a700d5ef" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2104.011105] env[68906]: DEBUG nova.network.neutron [None req-8466e239-f128-4599-96da-bff01e78a993 tempest-ServerDiskConfigTestJSON-1909384467 tempest-ServerDiskConfigTestJSON-1909384467-project-member] [instance: cd208e67-55a3-4c0b-ad49-abd3a700d5ef] Building network info cache for instance {{(pid=68906) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 2104.047874] env[68906]: DEBUG 
nova.network.neutron [None req-8466e239-f128-4599-96da-bff01e78a993 tempest-ServerDiskConfigTestJSON-1909384467 tempest-ServerDiskConfigTestJSON-1909384467-project-member] [instance: cd208e67-55a3-4c0b-ad49-abd3a700d5ef] Instance cache missing network info. {{(pid=68906) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 2104.207401] env[68906]: DEBUG nova.network.neutron [None req-8466e239-f128-4599-96da-bff01e78a993 tempest-ServerDiskConfigTestJSON-1909384467 tempest-ServerDiskConfigTestJSON-1909384467-project-member] [instance: cd208e67-55a3-4c0b-ad49-abd3a700d5ef] Updating instance_info_cache with network_info: [{"id": "a9beb9f3-a2d0-4d9b-b31e-50ef65b8e0e4", "address": "fa:16:3e:46:52:4e", "network": {"id": "8895f8be-c1f8-4a8b-8708-2bc0a03c5b67", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-449319980-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "41d6ceb682de4f6088d3b84b57ae1101", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "a58387dd-f438-4913-af6a-fafb734cd881", "external-id": "nsx-vlan-transportzone-169", "segmentation_id": 169, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapa9beb9f3-a2", "ovs_interfaceid": "a9beb9f3-a2d0-4d9b-b31e-50ef65b8e0e4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68906) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2104.218249] env[68906]: DEBUG oslo_concurrency.lockutils [None req-8466e239-f128-4599-96da-bff01e78a993 tempest-ServerDiskConfigTestJSON-1909384467 tempest-ServerDiskConfigTestJSON-1909384467-project-member] Releasing lock "refresh_cache-cd208e67-55a3-4c0b-ad49-abd3a700d5ef" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2104.218557] env[68906]: DEBUG nova.compute.manager [None req-8466e239-f128-4599-96da-bff01e78a993 tempest-ServerDiskConfigTestJSON-1909384467 tempest-ServerDiskConfigTestJSON-1909384467-project-member] [instance: cd208e67-55a3-4c0b-ad49-abd3a700d5ef] Instance network_info: |[{"id": "a9beb9f3-a2d0-4d9b-b31e-50ef65b8e0e4", "address": "fa:16:3e:46:52:4e", "network": {"id": "8895f8be-c1f8-4a8b-8708-2bc0a03c5b67", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-449319980-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "41d6ceb682de4f6088d3b84b57ae1101", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "a58387dd-f438-4913-af6a-fafb734cd881", "external-id": "nsx-vlan-transportzone-169", "segmentation_id": 169, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapa9beb9f3-a2", 
"ovs_interfaceid": "a9beb9f3-a2d0-4d9b-b31e-50ef65b8e0e4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=68906) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 2104.218943] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-8466e239-f128-4599-96da-bff01e78a993 tempest-ServerDiskConfigTestJSON-1909384467 tempest-ServerDiskConfigTestJSON-1909384467-project-member] [instance: cd208e67-55a3-4c0b-ad49-abd3a700d5ef] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:46:52:4e', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'a58387dd-f438-4913-af6a-fafb734cd881', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'a9beb9f3-a2d0-4d9b-b31e-50ef65b8e0e4', 'vif_model': 'vmxnet3'}] {{(pid=68906) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 2104.226683] env[68906]: DEBUG oslo.service.loopingcall [None req-8466e239-f128-4599-96da-bff01e78a993 tempest-ServerDiskConfigTestJSON-1909384467 tempest-ServerDiskConfigTestJSON-1909384467-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=68906) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2104.227149] env[68906]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: cd208e67-55a3-4c0b-ad49-abd3a700d5ef] Creating VM on the ESX host {{(pid=68906) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 2104.227371] env[68906]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-a8b6a08c-4f70-451f-86d5-5a2c38ea9ae7 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2104.247680] env[68906]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 2104.247680] env[68906]: value = "task-3475465" [ 2104.247680] env[68906]: _type = "Task" [ 2104.247680] env[68906]: } to complete. {{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2104.259622] env[68906]: DEBUG oslo_vmware.api [-] Task: {'id': task-3475465, 'name': CreateVM_Task} progress is 0%. 
{{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2104.452180] env[68906]: DEBUG nova.compute.manager [req-4cc3e80f-fd9f-4714-8c60-4b85e96f652d req-57a24c8f-4248-4cbc-b6c3-86a44e93732c service nova] [instance: cd208e67-55a3-4c0b-ad49-abd3a700d5ef] Received event network-vif-plugged-a9beb9f3-a2d0-4d9b-b31e-50ef65b8e0e4 {{(pid=68906) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 2104.452346] env[68906]: DEBUG oslo_concurrency.lockutils [req-4cc3e80f-fd9f-4714-8c60-4b85e96f652d req-57a24c8f-4248-4cbc-b6c3-86a44e93732c service nova] Acquiring lock "cd208e67-55a3-4c0b-ad49-abd3a700d5ef-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2104.452547] env[68906]: DEBUG oslo_concurrency.lockutils [req-4cc3e80f-fd9f-4714-8c60-4b85e96f652d req-57a24c8f-4248-4cbc-b6c3-86a44e93732c service nova] Lock "cd208e67-55a3-4c0b-ad49-abd3a700d5ef-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2104.452724] env[68906]: DEBUG oslo_concurrency.lockutils [req-4cc3e80f-fd9f-4714-8c60-4b85e96f652d req-57a24c8f-4248-4cbc-b6c3-86a44e93732c service nova] Lock "cd208e67-55a3-4c0b-ad49-abd3a700d5ef-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2104.452887] env[68906]: DEBUG nova.compute.manager [req-4cc3e80f-fd9f-4714-8c60-4b85e96f652d req-57a24c8f-4248-4cbc-b6c3-86a44e93732c service nova] [instance: cd208e67-55a3-4c0b-ad49-abd3a700d5ef] No waiting events found dispatching network-vif-plugged-a9beb9f3-a2d0-4d9b-b31e-50ef65b8e0e4 {{(pid=68906) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 2104.453066] env[68906]: WARNING nova.compute.manager [req-4cc3e80f-fd9f-4714-8c60-4b85e96f652d req-57a24c8f-4248-4cbc-b6c3-86a44e93732c service nova] [instance: cd208e67-55a3-4c0b-ad49-abd3a700d5ef] Received unexpected event network-vif-plugged-a9beb9f3-a2d0-4d9b-b31e-50ef65b8e0e4 for instance with vm_state building and task_state spawning. [ 2104.453231] env[68906]: DEBUG nova.compute.manager [req-4cc3e80f-fd9f-4714-8c60-4b85e96f652d req-57a24c8f-4248-4cbc-b6c3-86a44e93732c service nova] [instance: cd208e67-55a3-4c0b-ad49-abd3a700d5ef] Received event network-changed-a9beb9f3-a2d0-4d9b-b31e-50ef65b8e0e4 {{(pid=68906) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 2104.453381] env[68906]: DEBUG nova.compute.manager [req-4cc3e80f-fd9f-4714-8c60-4b85e96f652d req-57a24c8f-4248-4cbc-b6c3-86a44e93732c service nova] [instance: cd208e67-55a3-4c0b-ad49-abd3a700d5ef] Refreshing instance network info cache due to event network-changed-a9beb9f3-a2d0-4d9b-b31e-50ef65b8e0e4.
{{(pid=68906) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 2104.453591] env[68906]: DEBUG oslo_concurrency.lockutils [req-4cc3e80f-fd9f-4714-8c60-4b85e96f652d req-57a24c8f-4248-4cbc-b6c3-86a44e93732c service nova] Acquiring lock "refresh_cache-cd208e67-55a3-4c0b-ad49-abd3a700d5ef" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2104.453739] env[68906]: DEBUG oslo_concurrency.lockutils [req-4cc3e80f-fd9f-4714-8c60-4b85e96f652d req-57a24c8f-4248-4cbc-b6c3-86a44e93732c service nova] Acquired lock "refresh_cache-cd208e67-55a3-4c0b-ad49-abd3a700d5ef" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2104.453893] env[68906]: DEBUG nova.network.neutron [req-4cc3e80f-fd9f-4714-8c60-4b85e96f652d req-57a24c8f-4248-4cbc-b6c3-86a44e93732c service nova] [instance: cd208e67-55a3-4c0b-ad49-abd3a700d5ef] Refreshing network info cache for port a9beb9f3-a2d0-4d9b-b31e-50ef65b8e0e4 {{(pid=68906) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 2104.687686] env[68906]: DEBUG nova.network.neutron [req-4cc3e80f-fd9f-4714-8c60-4b85e96f652d req-57a24c8f-4248-4cbc-b6c3-86a44e93732c service nova] [instance: cd208e67-55a3-4c0b-ad49-abd3a700d5ef] Updated VIF entry in instance network info cache for port a9beb9f3-a2d0-4d9b-b31e-50ef65b8e0e4. {{(pid=68906) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 2104.688114] env[68906]: DEBUG nova.network.neutron [req-4cc3e80f-fd9f-4714-8c60-4b85e96f652d req-57a24c8f-4248-4cbc-b6c3-86a44e93732c service nova] [instance: cd208e67-55a3-4c0b-ad49-abd3a700d5ef] Updating instance_info_cache with network_info: [{"id": "a9beb9f3-a2d0-4d9b-b31e-50ef65b8e0e4", "address": "fa:16:3e:46:52:4e", "network": {"id": "8895f8be-c1f8-4a8b-8708-2bc0a03c5b67", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-449319980-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "41d6ceb682de4f6088d3b84b57ae1101", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "a58387dd-f438-4913-af6a-fafb734cd881", "external-id": "nsx-vlan-transportzone-169", "segmentation_id": 169, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapa9beb9f3-a2", "ovs_interfaceid": "a9beb9f3-a2d0-4d9b-b31e-50ef65b8e0e4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68906) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2104.696951] env[68906]: DEBUG oslo_concurrency.lockutils [req-4cc3e80f-fd9f-4714-8c60-4b85e96f652d req-57a24c8f-4248-4cbc-b6c3-86a44e93732c service nova] Releasing lock "refresh_cache-cd208e67-55a3-4c0b-ad49-abd3a700d5ef" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2104.757298] env[68906]: DEBUG oslo_vmware.api [-] Task: {'id': task-3475465, 'name': CreateVM_Task, 'duration_secs': 0.282462} completed successfully. 
{{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2104.757493] env[68906]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: cd208e67-55a3-4c0b-ad49-abd3a700d5ef] Created VM on the ESX host {{(pid=68906) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 2104.758124] env[68906]: DEBUG oslo_concurrency.lockutils [None req-8466e239-f128-4599-96da-bff01e78a993 tempest-ServerDiskConfigTestJSON-1909384467 tempest-ServerDiskConfigTestJSON-1909384467-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2104.758289] env[68906]: DEBUG oslo_concurrency.lockutils [None req-8466e239-f128-4599-96da-bff01e78a993 tempest-ServerDiskConfigTestJSON-1909384467 tempest-ServerDiskConfigTestJSON-1909384467-project-member] Acquired lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2104.758620] env[68906]: DEBUG oslo_concurrency.lockutils [None req-8466e239-f128-4599-96da-bff01e78a993 tempest-ServerDiskConfigTestJSON-1909384467 tempest-ServerDiskConfigTestJSON-1909384467-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 2104.758860] env[68906]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-4b2943c4-4152-4c5b-9a68-5f52eb8eabc7 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2104.763119] env[68906]: DEBUG oslo_vmware.api [None req-8466e239-f128-4599-96da-bff01e78a993 tempest-ServerDiskConfigTestJSON-1909384467 tempest-ServerDiskConfigTestJSON-1909384467-project-member] Waiting for the task: (returnval){ [ 2104.763119] env[68906]: value = "session[52a3cb0d-1212-490c-2669-91043b4da4d8]5248a9ad-886e-ce16-8f8d-22ba64100e5e" [ 2104.763119] env[68906]: _type = "Task" [ 2104.763119] env[68906]: } to complete. {{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2104.776568] env[68906]: DEBUG oslo_vmware.api [None req-8466e239-f128-4599-96da-bff01e78a993 tempest-ServerDiskConfigTestJSON-1909384467 tempest-ServerDiskConfigTestJSON-1909384467-project-member] Task: {'id': session[52a3cb0d-1212-490c-2669-91043b4da4d8]5248a9ad-886e-ce16-8f8d-22ba64100e5e, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2105.140968] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2105.273458] env[68906]: DEBUG oslo_concurrency.lockutils [None req-8466e239-f128-4599-96da-bff01e78a993 tempest-ServerDiskConfigTestJSON-1909384467 tempest-ServerDiskConfigTestJSON-1909384467-project-member] Releasing lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2105.273719] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-8466e239-f128-4599-96da-bff01e78a993 tempest-ServerDiskConfigTestJSON-1909384467 tempest-ServerDiskConfigTestJSON-1909384467-project-member] [instance: cd208e67-55a3-4c0b-ad49-abd3a700d5ef] Processing image b1400c31-d33b-4e13-944f-4c645e62493e {{(pid=68906) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 2105.273927] env[68906]: DEBUG oslo_concurrency.lockutils [None req-8466e239-f128-4599-96da-bff01e78a993 tempest-ServerDiskConfigTestJSON-1909384467 tempest-ServerDiskConfigTestJSON-1909384467-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e/b1400c31-d33b-4e13-944f-4c645e62493e.vmdk" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2106.148513] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager.update_available_resource {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2106.160442] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2106.160664] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2106.160831] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2106.160984] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=68906) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 2106.162086] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3676af45-e4f4-40e8-9b88-53bd4d755ae3 {{(pid=68906) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2106.170544] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d4380247-e026-4d65-a020-f5a058456427 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2106.186008] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c9549d33-af1c-4e06-948d-bff12016baec {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2106.192461] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b2e5387b-7d22-4e5b-baa5-3c3da1a0e4a5 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2106.221521] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180930MB free_disk=93GB free_vcpus=48 pci_devices=None {{(pid=68906) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 2106.221666] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2106.221855] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2106.291664] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 736db39c-e5e5-4a54-b85a-aa5c703f432e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2106.291895] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance ce6e5cd6-efb8-46d1-811d-74c084661cce actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2106.292074] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 7994d291-b4bf-48f5-ad34-c1f484d77f6e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2106.292216] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 860248ea-e77b-4ff6-af64-b75f88a31348 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2106.292339] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 3cfde5a7-3148-426c-8867-ffafb33dc95b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2106.292459] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 01b79dfa-cd20-495d-b112-8429c28b741e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2106.292579] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 8bfc91d4-b1d7-449a-8d48-0e63490fe663 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2106.292695] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance d70b039d-c8ad-4ffd-84f8-08f17cb97578 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2106.292820] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance ed276c3c-6085-427d-b3b7-86bbb8660dbc actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2106.292927] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance cd208e67-55a3-4c0b-ad49-abd3a700d5ef actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2106.304478] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance a4f1c6a3-c189-4e3a-8ac9-ac6ec8b95723 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 2106.304701] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=68906) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 2106.304848] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=68906) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 2106.432876] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fa283b73-b872-4381-ad68-1fc0a863dead {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2106.440407] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e447dfb0-e57c-4472-a423-90f78896ddc5 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2106.471142] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-92ec56c0-332d-4df4-8914-14d45942fcc2 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2106.477933] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2660631d-d48d-482a-8f12-51310cb0c096 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2106.490797] env[68906]: DEBUG nova.compute.provider_tree [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Inventory has not changed in ProviderTree for provider: 1119f6db-bfd7-4ef3-bdff-5c6974dc249b {{(pid=68906) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2106.499963] env[68906]: DEBUG nova.scheduler.client.report [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Inventory has not changed for provider 1119f6db-bfd7-4ef3-bdff-5c6974dc249b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 93, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68906) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2106.524358] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=68906) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 2106.524555] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.303s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2107.512070] env[68906]: DEBUG oslo_service.periodic_task [None 
req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2114.141423] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._run_pending_deletes {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2114.141714] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Cleaning up deleted instances {{(pid=68906) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11198}} [ 2114.151687] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] There are 0 instances to clean {{(pid=68906) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11207}} [ 2148.152024] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2151.006013] env[68906]: WARNING oslo_vmware.rw_handles [None req-5a9e0d39-3b4d-4e01-a665-7603e8d12e3f tempest-ServersNegativeTestMultiTenantJSON-1592777577 tempest-ServersNegativeTestMultiTenantJSON-1592777577-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 2151.006013] env[68906]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 2151.006013] env[68906]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 2151.006013] env[68906]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 2151.006013] env[68906]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 2151.006013] env[68906]: ERROR oslo_vmware.rw_handles response.begin() [ 2151.006013] env[68906]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 2151.006013] env[68906]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 2151.006013] env[68906]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 2151.006013] env[68906]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 2151.006013] env[68906]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 2151.006013] env[68906]: ERROR oslo_vmware.rw_handles [ 2151.006651] env[68906]: DEBUG nova.virt.vmwareapi.images [None req-5a9e0d39-3b4d-4e01-a665-7603e8d12e3f tempest-ServersNegativeTestMultiTenantJSON-1592777577 tempest-ServersNegativeTestMultiTenantJSON-1592777577-project-member] [instance: 736db39c-e5e5-4a54-b85a-aa5c703f432e] Downloaded image file data b1400c31-d33b-4e13-944f-4c645e62493e to vmware_temp/24af176a-dd32-4244-a437-c10f0bcc4e08/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk on the data store datastore2 {{(pid=68906) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 2151.008590] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-5a9e0d39-3b4d-4e01-a665-7603e8d12e3f tempest-ServersNegativeTestMultiTenantJSON-1592777577 
tempest-ServersNegativeTestMultiTenantJSON-1592777577-project-member] [instance: 736db39c-e5e5-4a54-b85a-aa5c703f432e] Caching image {{(pid=68906) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 2151.008839] env[68906]: DEBUG nova.virt.vmwareapi.vm_util [None req-5a9e0d39-3b4d-4e01-a665-7603e8d12e3f tempest-ServersNegativeTestMultiTenantJSON-1592777577 tempest-ServersNegativeTestMultiTenantJSON-1592777577-project-member] Copying Virtual Disk [datastore2] vmware_temp/24af176a-dd32-4244-a437-c10f0bcc4e08/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk to [datastore2] vmware_temp/24af176a-dd32-4244-a437-c10f0bcc4e08/b1400c31-d33b-4e13-944f-4c645e62493e/b1400c31-d33b-4e13-944f-4c645e62493e.vmdk {{(pid=68906) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 2151.009151] env[68906]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-c7639158-1914-4a2b-aace-8be6604d6b0b {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2151.016704] env[68906]: DEBUG oslo_vmware.api [None req-5a9e0d39-3b4d-4e01-a665-7603e8d12e3f tempest-ServersNegativeTestMultiTenantJSON-1592777577 tempest-ServersNegativeTestMultiTenantJSON-1592777577-project-member] Waiting for the task: (returnval){ [ 2151.016704] env[68906]: value = "task-3475466" [ 2151.016704] env[68906]: _type = "Task" [ 2151.016704] env[68906]: } to complete. {{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2151.024237] env[68906]: DEBUG oslo_vmware.api [None req-5a9e0d39-3b4d-4e01-a665-7603e8d12e3f tempest-ServersNegativeTestMultiTenantJSON-1592777577 tempest-ServersNegativeTestMultiTenantJSON-1592777577-project-member] Task: {'id': task-3475466, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2151.526861] env[68906]: DEBUG oslo_vmware.exceptions [None req-5a9e0d39-3b4d-4e01-a665-7603e8d12e3f tempest-ServersNegativeTestMultiTenantJSON-1592777577 tempest-ServersNegativeTestMultiTenantJSON-1592777577-project-member] Fault InvalidArgument not matched. 
{{(pid=68906) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 2151.526861] env[68906]: DEBUG oslo_concurrency.lockutils [None req-5a9e0d39-3b4d-4e01-a665-7603e8d12e3f tempest-ServersNegativeTestMultiTenantJSON-1592777577 tempest-ServersNegativeTestMultiTenantJSON-1592777577-project-member] Releasing lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e/b1400c31-d33b-4e13-944f-4c645e62493e.vmdk" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2151.527263] env[68906]: ERROR nova.compute.manager [None req-5a9e0d39-3b4d-4e01-a665-7603e8d12e3f tempest-ServersNegativeTestMultiTenantJSON-1592777577 tempest-ServersNegativeTestMultiTenantJSON-1592777577-project-member] [instance: 736db39c-e5e5-4a54-b85a-aa5c703f432e] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2151.527263] env[68906]: Faults: ['InvalidArgument'] [ 2151.527263] env[68906]: ERROR nova.compute.manager [instance: 736db39c-e5e5-4a54-b85a-aa5c703f432e] Traceback (most recent call last): [ 2151.527263] env[68906]: ERROR nova.compute.manager [instance: 736db39c-e5e5-4a54-b85a-aa5c703f432e] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 2151.527263] env[68906]: ERROR nova.compute.manager [instance: 736db39c-e5e5-4a54-b85a-aa5c703f432e] yield resources [ 2151.527263] env[68906]: ERROR nova.compute.manager [instance: 736db39c-e5e5-4a54-b85a-aa5c703f432e] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 2151.527263] env[68906]: ERROR nova.compute.manager [instance: 736db39c-e5e5-4a54-b85a-aa5c703f432e] self.driver.spawn(context, instance, image_meta, [ 2151.527263] env[68906]: ERROR nova.compute.manager [instance: 736db39c-e5e5-4a54-b85a-aa5c703f432e] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2151.527263] env[68906]: ERROR nova.compute.manager [instance: 736db39c-e5e5-4a54-b85a-aa5c703f432e] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2151.527263] env[68906]: ERROR nova.compute.manager [instance: 736db39c-e5e5-4a54-b85a-aa5c703f432e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2151.527263] env[68906]: ERROR nova.compute.manager [instance: 736db39c-e5e5-4a54-b85a-aa5c703f432e] self._fetch_image_if_missing(context, vi) [ 2151.527263] env[68906]: ERROR nova.compute.manager [instance: 736db39c-e5e5-4a54-b85a-aa5c703f432e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2151.527582] env[68906]: ERROR nova.compute.manager [instance: 736db39c-e5e5-4a54-b85a-aa5c703f432e] image_cache(vi, tmp_image_ds_loc) [ 2151.527582] env[68906]: ERROR nova.compute.manager [instance: 736db39c-e5e5-4a54-b85a-aa5c703f432e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2151.527582] env[68906]: ERROR nova.compute.manager [instance: 736db39c-e5e5-4a54-b85a-aa5c703f432e] vm_util.copy_virtual_disk( [ 2151.527582] env[68906]: ERROR nova.compute.manager [instance: 736db39c-e5e5-4a54-b85a-aa5c703f432e] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2151.527582] env[68906]: ERROR nova.compute.manager [instance: 736db39c-e5e5-4a54-b85a-aa5c703f432e] session._wait_for_task(vmdk_copy_task) [ 2151.527582] env[68906]: ERROR nova.compute.manager [instance: 
736db39c-e5e5-4a54-b85a-aa5c703f432e] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2151.527582] env[68906]: ERROR nova.compute.manager [instance: 736db39c-e5e5-4a54-b85a-aa5c703f432e] return self.wait_for_task(task_ref) [ 2151.527582] env[68906]: ERROR nova.compute.manager [instance: 736db39c-e5e5-4a54-b85a-aa5c703f432e] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2151.527582] env[68906]: ERROR nova.compute.manager [instance: 736db39c-e5e5-4a54-b85a-aa5c703f432e] return evt.wait() [ 2151.527582] env[68906]: ERROR nova.compute.manager [instance: 736db39c-e5e5-4a54-b85a-aa5c703f432e] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2151.527582] env[68906]: ERROR nova.compute.manager [instance: 736db39c-e5e5-4a54-b85a-aa5c703f432e] result = hub.switch() [ 2151.527582] env[68906]: ERROR nova.compute.manager [instance: 736db39c-e5e5-4a54-b85a-aa5c703f432e] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2151.527582] env[68906]: ERROR nova.compute.manager [instance: 736db39c-e5e5-4a54-b85a-aa5c703f432e] return self.greenlet.switch() [ 2151.527876] env[68906]: ERROR nova.compute.manager [instance: 736db39c-e5e5-4a54-b85a-aa5c703f432e] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2151.527876] env[68906]: ERROR nova.compute.manager [instance: 736db39c-e5e5-4a54-b85a-aa5c703f432e] self.f(*self.args, **self.kw) [ 2151.527876] env[68906]: ERROR nova.compute.manager [instance: 736db39c-e5e5-4a54-b85a-aa5c703f432e] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2151.527876] env[68906]: ERROR nova.compute.manager [instance: 736db39c-e5e5-4a54-b85a-aa5c703f432e] raise exceptions.translate_fault(task_info.error) [ 2151.527876] env[68906]: ERROR nova.compute.manager [instance: 736db39c-e5e5-4a54-b85a-aa5c703f432e] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2151.527876] env[68906]: ERROR nova.compute.manager [instance: 736db39c-e5e5-4a54-b85a-aa5c703f432e] Faults: ['InvalidArgument'] [ 2151.527876] env[68906]: ERROR nova.compute.manager [instance: 736db39c-e5e5-4a54-b85a-aa5c703f432e] [ 2151.527876] env[68906]: INFO nova.compute.manager [None req-5a9e0d39-3b4d-4e01-a665-7603e8d12e3f tempest-ServersNegativeTestMultiTenantJSON-1592777577 tempest-ServersNegativeTestMultiTenantJSON-1592777577-project-member] [instance: 736db39c-e5e5-4a54-b85a-aa5c703f432e] Terminating instance [ 2151.529825] env[68906]: DEBUG oslo_concurrency.lockutils [None req-8706cf73-54ea-418d-91c3-3d08fba88ac5 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Acquired lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e/b1400c31-d33b-4e13-944f-4c645e62493e.vmdk" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2151.529825] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-8706cf73-54ea-418d-91c3-3d08fba88ac5 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=68906) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2151.529825] env[68906]: DEBUG oslo_vmware.service [-] Invoking 
FileManager.MakeDirectory with opID=oslo.vmware-afdf7c32-ea21-4976-8379-bdba41bee9e6 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2151.531696] env[68906]: DEBUG nova.compute.manager [None req-5a9e0d39-3b4d-4e01-a665-7603e8d12e3f tempest-ServersNegativeTestMultiTenantJSON-1592777577 tempest-ServersNegativeTestMultiTenantJSON-1592777577-project-member] [instance: 736db39c-e5e5-4a54-b85a-aa5c703f432e] Start destroying the instance on the hypervisor. {{(pid=68906) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2151.531890] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-5a9e0d39-3b4d-4e01-a665-7603e8d12e3f tempest-ServersNegativeTestMultiTenantJSON-1592777577 tempest-ServersNegativeTestMultiTenantJSON-1592777577-project-member] [instance: 736db39c-e5e5-4a54-b85a-aa5c703f432e] Destroying instance {{(pid=68906) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2151.532599] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4034dc0d-1543-4f80-9ffc-b16e3f0c9142 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2151.539354] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-5a9e0d39-3b4d-4e01-a665-7603e8d12e3f tempest-ServersNegativeTestMultiTenantJSON-1592777577 tempest-ServersNegativeTestMultiTenantJSON-1592777577-project-member] [instance: 736db39c-e5e5-4a54-b85a-aa5c703f432e] Unregistering the VM {{(pid=68906) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 2151.539564] env[68906]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-58a7a48e-4839-4d6b-b7e7-65f831d3ef03 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2151.541554] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-8706cf73-54ea-418d-91c3-3d08fba88ac5 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=68906) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2151.541726] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-8706cf73-54ea-418d-91c3-3d08fba88ac5 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=68906) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 2151.542643] env[68906]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-99a8b02e-9368-4121-97ce-fd50221a8273 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2151.547029] env[68906]: DEBUG oslo_vmware.api [None req-8706cf73-54ea-418d-91c3-3d08fba88ac5 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Waiting for the task: (returnval){ [ 2151.547029] env[68906]: value = "session[52a3cb0d-1212-490c-2669-91043b4da4d8]52f77e84-2c7e-4d31-851a-6c175bb844df" [ 2151.547029] env[68906]: _type = "Task" [ 2151.547029] env[68906]: } to complete. 
{{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2151.555720] env[68906]: DEBUG oslo_vmware.api [None req-8706cf73-54ea-418d-91c3-3d08fba88ac5 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Task: {'id': session[52a3cb0d-1212-490c-2669-91043b4da4d8]52f77e84-2c7e-4d31-851a-6c175bb844df, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2151.617891] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-5a9e0d39-3b4d-4e01-a665-7603e8d12e3f tempest-ServersNegativeTestMultiTenantJSON-1592777577 tempest-ServersNegativeTestMultiTenantJSON-1592777577-project-member] [instance: 736db39c-e5e5-4a54-b85a-aa5c703f432e] Unregistered the VM {{(pid=68906) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 2151.618160] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-5a9e0d39-3b4d-4e01-a665-7603e8d12e3f tempest-ServersNegativeTestMultiTenantJSON-1592777577 tempest-ServersNegativeTestMultiTenantJSON-1592777577-project-member] [instance: 736db39c-e5e5-4a54-b85a-aa5c703f432e] Deleting contents of the VM from datastore datastore2 {{(pid=68906) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 2151.618313] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-5a9e0d39-3b4d-4e01-a665-7603e8d12e3f tempest-ServersNegativeTestMultiTenantJSON-1592777577 tempest-ServersNegativeTestMultiTenantJSON-1592777577-project-member] Deleting the datastore file [datastore2] 736db39c-e5e5-4a54-b85a-aa5c703f432e {{(pid=68906) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 2151.618571] env[68906]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-89a4f457-3186-418d-bbfa-cc4a1cd7a560 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2151.625117] env[68906]: DEBUG oslo_vmware.api [None req-5a9e0d39-3b4d-4e01-a665-7603e8d12e3f tempest-ServersNegativeTestMultiTenantJSON-1592777577 tempest-ServersNegativeTestMultiTenantJSON-1592777577-project-member] Waiting for the task: (returnval){ [ 2151.625117] env[68906]: value = "task-3475468" [ 2151.625117] env[68906]: _type = "Task" [ 2151.625117] env[68906]: } to complete. {{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2151.632493] env[68906]: DEBUG oslo_vmware.api [None req-5a9e0d39-3b4d-4e01-a665-7603e8d12e3f tempest-ServersNegativeTestMultiTenantJSON-1592777577 tempest-ServersNegativeTestMultiTenantJSON-1592777577-project-member] Task: {'id': task-3475468, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2152.057062] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-8706cf73-54ea-418d-91c3-3d08fba88ac5 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] [instance: ce6e5cd6-efb8-46d1-811d-74c084661cce] Preparing fetch location {{(pid=68906) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 2152.057450] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-8706cf73-54ea-418d-91c3-3d08fba88ac5 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Creating directory with path [datastore2] vmware_temp/ffed3f6c-acd2-4e7b-9285-24e58baa3033/b1400c31-d33b-4e13-944f-4c645e62493e {{(pid=68906) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2152.057526] env[68906]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-afbae51b-781e-41f5-ae37-641446583c74 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2152.068017] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-8706cf73-54ea-418d-91c3-3d08fba88ac5 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Created directory with path [datastore2] vmware_temp/ffed3f6c-acd2-4e7b-9285-24e58baa3033/b1400c31-d33b-4e13-944f-4c645e62493e {{(pid=68906) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2152.069030] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-8706cf73-54ea-418d-91c3-3d08fba88ac5 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] [instance: ce6e5cd6-efb8-46d1-811d-74c084661cce] Fetch image to [datastore2] vmware_temp/ffed3f6c-acd2-4e7b-9285-24e58baa3033/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk {{(pid=68906) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 2152.069030] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-8706cf73-54ea-418d-91c3-3d08fba88ac5 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] [instance: ce6e5cd6-efb8-46d1-811d-74c084661cce] Downloading image file data b1400c31-d33b-4e13-944f-4c645e62493e to [datastore2] vmware_temp/ffed3f6c-acd2-4e7b-9285-24e58baa3033/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk on the data store datastore2 {{(pid=68906) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 2152.069212] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-efa89335-f101-4c9b-87fc-51f7b97db261 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2152.076021] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ea2d31d3-dcda-4597-8bab-23c983f2a35d {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2152.083873] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fa12a2d4-024c-48f6-b543-643034bf4375 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2152.132841] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-4bc8d4c2-6dd4-4a42-a996-0cc3b7a2f706 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2152.142364] env[68906]: DEBUG oslo_vmware.api [None req-5a9e0d39-3b4d-4e01-a665-7603e8d12e3f tempest-ServersNegativeTestMultiTenantJSON-1592777577 tempest-ServersNegativeTestMultiTenantJSON-1592777577-project-member] Task: {'id': task-3475468, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.062144} completed successfully. {{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2152.144329] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-5a9e0d39-3b4d-4e01-a665-7603e8d12e3f tempest-ServersNegativeTestMultiTenantJSON-1592777577 tempest-ServersNegativeTestMultiTenantJSON-1592777577-project-member] Deleted the datastore file {{(pid=68906) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 2152.144604] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-5a9e0d39-3b4d-4e01-a665-7603e8d12e3f tempest-ServersNegativeTestMultiTenantJSON-1592777577 tempest-ServersNegativeTestMultiTenantJSON-1592777577-project-member] [instance: 736db39c-e5e5-4a54-b85a-aa5c703f432e] Deleted contents of the VM from datastore datastore2 {{(pid=68906) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 2152.144860] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-5a9e0d39-3b4d-4e01-a665-7603e8d12e3f tempest-ServersNegativeTestMultiTenantJSON-1592777577 tempest-ServersNegativeTestMultiTenantJSON-1592777577-project-member] [instance: 736db39c-e5e5-4a54-b85a-aa5c703f432e] Instance destroyed {{(pid=68906) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2152.145130] env[68906]: INFO nova.compute.manager [None req-5a9e0d39-3b4d-4e01-a665-7603e8d12e3f tempest-ServersNegativeTestMultiTenantJSON-1592777577 tempest-ServersNegativeTestMultiTenantJSON-1592777577-project-member] [instance: 736db39c-e5e5-4a54-b85a-aa5c703f432e] Took 0.61 seconds to destroy the instance on the hypervisor. 
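The task waits above ("progress is 0%", "completed successfully", and the earlier InvalidArgument fault) all come from oslo.vmware's poll-and-translate loop: the caller blocks on wait_for_task while a looping call polls the vCenter task and, if the task ends in an error state, raises the translated fault (here VimFaultException). Below is a minimal, self-contained sketch of that pattern; fetch_task_info and TaskFault are hypothetical stand-ins, not the real oslo.vmware API.

```python
# Minimal sketch of the poll-and-translate loop visible in the tracebacks and
# "progress is 0%" / "completed successfully" lines above. fetch_task_info()
# and TaskFault are hypothetical stand-ins, not the real oslo.vmware API.
import time

class TaskFault(Exception):
    """A vCenter task ended in an error state (e.g. 'InvalidArgument')."""

def wait_for_task(task_ref, fetch_task_info, interval=0.5):
    """Block until the task completes; raise TaskFault on failure."""
    while True:
        info = fetch_task_info(task_ref)  # one property-collector round trip
        if info['state'] == 'success':
            return info.get('result')
        if info['state'] == 'error':
            # Mirrors `raise exceptions.translate_fault(task_info.error)` in
            # the traceback: the VIM fault becomes a Python exception.
            raise TaskFault(info['error'])
        time.sleep(interval)  # still 'queued'/'running'; poll again
```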
[ 2152.147978] env[68906]: DEBUG nova.compute.claims [None req-5a9e0d39-3b4d-4e01-a665-7603e8d12e3f tempest-ServersNegativeTestMultiTenantJSON-1592777577 tempest-ServersNegativeTestMultiTenantJSON-1592777577-project-member] [instance: 736db39c-e5e5-4a54-b85a-aa5c703f432e] Aborting claim: {{(pid=68906) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 2152.148268] env[68906]: DEBUG oslo_concurrency.lockutils [None req-5a9e0d39-3b4d-4e01-a665-7603e8d12e3f tempest-ServersNegativeTestMultiTenantJSON-1592777577 tempest-ServersNegativeTestMultiTenantJSON-1592777577-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2152.148578] env[68906]: DEBUG oslo_concurrency.lockutils [None req-5a9e0d39-3b4d-4e01-a665-7603e8d12e3f tempest-ServersNegativeTestMultiTenantJSON-1592777577 tempest-ServersNegativeTestMultiTenantJSON-1592777577-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2152.152161] env[68906]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-03bb11e8-71dd-4b34-b05b-ce43a6d9fc66 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2152.176343] env[68906]: DEBUG nova.virt.vmwareapi.images [None req-8706cf73-54ea-418d-91c3-3d08fba88ac5 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] [instance: ce6e5cd6-efb8-46d1-811d-74c084661cce] Downloading image file data b1400c31-d33b-4e13-944f-4c645e62493e to the data store datastore2 {{(pid=68906) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 2152.305110] env[68906]: DEBUG oslo_vmware.rw_handles [None req-8706cf73-54ea-418d-91c3-3d08fba88ac5 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/ffed3f6c-acd2-4e7b-9285-24e58baa3033/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=68906) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 2152.366836] env[68906]: DEBUG oslo_vmware.rw_handles [None req-8706cf73-54ea-418d-91c3-3d08fba88ac5 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Completed reading data from the image iterator. {{(pid=68906) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 2152.367062] env[68906]: DEBUG oslo_vmware.rw_handles [None req-8706cf73-54ea-418d-91c3-3d08fba88ac5 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Closing write handle for https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/ffed3f6c-acd2-4e7b-9285-24e58baa3033/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=68906) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 2152.387351] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8a8492fd-4f28-40ca-a5ad-45db6a8c6b4e {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2152.396209] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3beec857-e1e4-4ae7-b32d-43a6dcda7982 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2152.427761] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b77d74f5-06b4-4efa-9f7f-ed2e269f37f9 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2152.434931] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-44dba267-d21e-4f8d-a3bc-e723bfc0c831 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2152.447922] env[68906]: DEBUG nova.compute.provider_tree [None req-5a9e0d39-3b4d-4e01-a665-7603e8d12e3f tempest-ServersNegativeTestMultiTenantJSON-1592777577 tempest-ServersNegativeTestMultiTenantJSON-1592777577-project-member] Inventory has not changed in ProviderTree for provider: 1119f6db-bfd7-4ef3-bdff-5c6974dc249b {{(pid=68906) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2152.475596] env[68906]: DEBUG nova.scheduler.client.report [None req-5a9e0d39-3b4d-4e01-a665-7603e8d12e3f tempest-ServersNegativeTestMultiTenantJSON-1592777577 tempest-ServersNegativeTestMultiTenantJSON-1592777577-project-member] Inventory has not changed for provider 1119f6db-bfd7-4ef3-bdff-5c6974dc249b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 93, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68906) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2152.491043] env[68906]: DEBUG oslo_concurrency.lockutils [None req-5a9e0d39-3b4d-4e01-a665-7603e8d12e3f tempest-ServersNegativeTestMultiTenantJSON-1592777577 tempest-ServersNegativeTestMultiTenantJSON-1592777577-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.342s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2152.491635] env[68906]: ERROR nova.compute.manager [None req-5a9e0d39-3b4d-4e01-a665-7603e8d12e3f tempest-ServersNegativeTestMultiTenantJSON-1592777577 tempest-ServersNegativeTestMultiTenantJSON-1592777577-project-member] [instance: 736db39c-e5e5-4a54-b85a-aa5c703f432e] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2152.491635] env[68906]: Faults: ['InvalidArgument'] [ 2152.491635] env[68906]: ERROR nova.compute.manager [instance: 736db39c-e5e5-4a54-b85a-aa5c703f432e] Traceback (most recent call last): [ 2152.491635] env[68906]: ERROR nova.compute.manager [instance: 736db39c-e5e5-4a54-b85a-aa5c703f432e] File 
"/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 2152.491635] env[68906]: ERROR nova.compute.manager [instance: 736db39c-e5e5-4a54-b85a-aa5c703f432e] self.driver.spawn(context, instance, image_meta, [ 2152.491635] env[68906]: ERROR nova.compute.manager [instance: 736db39c-e5e5-4a54-b85a-aa5c703f432e] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2152.491635] env[68906]: ERROR nova.compute.manager [instance: 736db39c-e5e5-4a54-b85a-aa5c703f432e] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2152.491635] env[68906]: ERROR nova.compute.manager [instance: 736db39c-e5e5-4a54-b85a-aa5c703f432e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2152.491635] env[68906]: ERROR nova.compute.manager [instance: 736db39c-e5e5-4a54-b85a-aa5c703f432e] self._fetch_image_if_missing(context, vi) [ 2152.491635] env[68906]: ERROR nova.compute.manager [instance: 736db39c-e5e5-4a54-b85a-aa5c703f432e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2152.491635] env[68906]: ERROR nova.compute.manager [instance: 736db39c-e5e5-4a54-b85a-aa5c703f432e] image_cache(vi, tmp_image_ds_loc) [ 2152.491635] env[68906]: ERROR nova.compute.manager [instance: 736db39c-e5e5-4a54-b85a-aa5c703f432e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2152.491989] env[68906]: ERROR nova.compute.manager [instance: 736db39c-e5e5-4a54-b85a-aa5c703f432e] vm_util.copy_virtual_disk( [ 2152.491989] env[68906]: ERROR nova.compute.manager [instance: 736db39c-e5e5-4a54-b85a-aa5c703f432e] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2152.491989] env[68906]: ERROR nova.compute.manager [instance: 736db39c-e5e5-4a54-b85a-aa5c703f432e] session._wait_for_task(vmdk_copy_task) [ 2152.491989] env[68906]: ERROR nova.compute.manager [instance: 736db39c-e5e5-4a54-b85a-aa5c703f432e] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2152.491989] env[68906]: ERROR nova.compute.manager [instance: 736db39c-e5e5-4a54-b85a-aa5c703f432e] return self.wait_for_task(task_ref) [ 2152.491989] env[68906]: ERROR nova.compute.manager [instance: 736db39c-e5e5-4a54-b85a-aa5c703f432e] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2152.491989] env[68906]: ERROR nova.compute.manager [instance: 736db39c-e5e5-4a54-b85a-aa5c703f432e] return evt.wait() [ 2152.491989] env[68906]: ERROR nova.compute.manager [instance: 736db39c-e5e5-4a54-b85a-aa5c703f432e] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2152.491989] env[68906]: ERROR nova.compute.manager [instance: 736db39c-e5e5-4a54-b85a-aa5c703f432e] result = hub.switch() [ 2152.491989] env[68906]: ERROR nova.compute.manager [instance: 736db39c-e5e5-4a54-b85a-aa5c703f432e] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2152.491989] env[68906]: ERROR nova.compute.manager [instance: 736db39c-e5e5-4a54-b85a-aa5c703f432e] return self.greenlet.switch() [ 2152.491989] env[68906]: ERROR nova.compute.manager [instance: 736db39c-e5e5-4a54-b85a-aa5c703f432e] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2152.491989] env[68906]: ERROR nova.compute.manager [instance: 736db39c-e5e5-4a54-b85a-aa5c703f432e] self.f(*self.args, **self.kw) [ 2152.492345] 
env[68906]: ERROR nova.compute.manager [instance: 736db39c-e5e5-4a54-b85a-aa5c703f432e] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2152.492345] env[68906]: ERROR nova.compute.manager [instance: 736db39c-e5e5-4a54-b85a-aa5c703f432e] raise exceptions.translate_fault(task_info.error) [ 2152.492345] env[68906]: ERROR nova.compute.manager [instance: 736db39c-e5e5-4a54-b85a-aa5c703f432e] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2152.492345] env[68906]: ERROR nova.compute.manager [instance: 736db39c-e5e5-4a54-b85a-aa5c703f432e] Faults: ['InvalidArgument'] [ 2152.492345] env[68906]: ERROR nova.compute.manager [instance: 736db39c-e5e5-4a54-b85a-aa5c703f432e] [ 2152.492471] env[68906]: DEBUG nova.compute.utils [None req-5a9e0d39-3b4d-4e01-a665-7603e8d12e3f tempest-ServersNegativeTestMultiTenantJSON-1592777577 tempest-ServersNegativeTestMultiTenantJSON-1592777577-project-member] [instance: 736db39c-e5e5-4a54-b85a-aa5c703f432e] VimFaultException {{(pid=68906) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 2152.493962] env[68906]: DEBUG nova.compute.manager [None req-5a9e0d39-3b4d-4e01-a665-7603e8d12e3f tempest-ServersNegativeTestMultiTenantJSON-1592777577 tempest-ServersNegativeTestMultiTenantJSON-1592777577-project-member] [instance: 736db39c-e5e5-4a54-b85a-aa5c703f432e] Build of instance 736db39c-e5e5-4a54-b85a-aa5c703f432e was re-scheduled: A specified parameter was not correct: fileType [ 2152.493962] env[68906]: Faults: ['InvalidArgument'] {{(pid=68906) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 2152.494435] env[68906]: DEBUG nova.compute.manager [None req-5a9e0d39-3b4d-4e01-a665-7603e8d12e3f tempest-ServersNegativeTestMultiTenantJSON-1592777577 tempest-ServersNegativeTestMultiTenantJSON-1592777577-project-member] [instance: 736db39c-e5e5-4a54-b85a-aa5c703f432e] Unplugging VIFs for instance {{(pid=68906) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 2152.494651] env[68906]: DEBUG nova.compute.manager [None req-5a9e0d39-3b4d-4e01-a665-7603e8d12e3f tempest-ServersNegativeTestMultiTenantJSON-1592777577 tempest-ServersNegativeTestMultiTenantJSON-1592777577-project-member] Virt driver does not provide unplug_vifs method, so it is not possible to determine if VIFs should be unplugged.
{{(pid=68906) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 2152.494906] env[68906]: DEBUG nova.compute.manager [None req-5a9e0d39-3b4d-4e01-a665-7603e8d12e3f tempest-ServersNegativeTestMultiTenantJSON-1592777577 tempest-ServersNegativeTestMultiTenantJSON-1592777577-project-member] [instance: 736db39c-e5e5-4a54-b85a-aa5c703f432e] Deallocating network for instance {{(pid=68906) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 2152.495146] env[68906]: DEBUG nova.network.neutron [None req-5a9e0d39-3b4d-4e01-a665-7603e8d12e3f tempest-ServersNegativeTestMultiTenantJSON-1592777577 tempest-ServersNegativeTestMultiTenantJSON-1592777577-project-member] [instance: 736db39c-e5e5-4a54-b85a-aa5c703f432e] deallocate_for_instance() {{(pid=68906) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2152.799476] env[68906]: DEBUG nova.network.neutron [None req-5a9e0d39-3b4d-4e01-a665-7603e8d12e3f tempest-ServersNegativeTestMultiTenantJSON-1592777577 tempest-ServersNegativeTestMultiTenantJSON-1592777577-project-member] [instance: 736db39c-e5e5-4a54-b85a-aa5c703f432e] Updating instance_info_cache with network_info: [] {{(pid=68906) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2152.812917] env[68906]: INFO nova.compute.manager [None req-5a9e0d39-3b4d-4e01-a665-7603e8d12e3f tempest-ServersNegativeTestMultiTenantJSON-1592777577 tempest-ServersNegativeTestMultiTenantJSON-1592777577-project-member] [instance: 736db39c-e5e5-4a54-b85a-aa5c703f432e] Took 0.32 seconds to deallocate network for instance. [ 2152.901626] env[68906]: INFO nova.scheduler.client.report [None req-5a9e0d39-3b4d-4e01-a665-7603e8d12e3f tempest-ServersNegativeTestMultiTenantJSON-1592777577 tempest-ServersNegativeTestMultiTenantJSON-1592777577-project-member] Deleted allocations for instance 736db39c-e5e5-4a54-b85a-aa5c703f432e [ 2152.924594] env[68906]: DEBUG oslo_concurrency.lockutils [None req-5a9e0d39-3b4d-4e01-a665-7603e8d12e3f tempest-ServersNegativeTestMultiTenantJSON-1592777577 tempest-ServersNegativeTestMultiTenantJSON-1592777577-project-member] Lock "736db39c-e5e5-4a54-b85a-aa5c703f432e" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 671.174s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2152.925721] env[68906]: DEBUG oslo_concurrency.lockutils [None req-04b74860-fe41-4e76-8019-c7a9761fa215 tempest-ServersNegativeTestMultiTenantJSON-1592777577 tempest-ServersNegativeTestMultiTenantJSON-1592777577-project-member] Lock "736db39c-e5e5-4a54-b85a-aa5c703f432e" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 474.616s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2152.925958] env[68906]: DEBUG oslo_concurrency.lockutils [None req-04b74860-fe41-4e76-8019-c7a9761fa215 tempest-ServersNegativeTestMultiTenantJSON-1592777577 tempest-ServersNegativeTestMultiTenantJSON-1592777577-project-member] Acquiring lock "736db39c-e5e5-4a54-b85a-aa5c703f432e-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2152.926160] env[68906]: DEBUG oslo_concurrency.lockutils [None req-04b74860-fe41-4e76-8019-c7a9761fa215 tempest-ServersNegativeTestMultiTenantJSON-1592777577
tempest-ServersNegativeTestMultiTenantJSON-1592777577-project-member] Lock "736db39c-e5e5-4a54-b85a-aa5c703f432e-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2152.926341] env[68906]: DEBUG oslo_concurrency.lockutils [None req-04b74860-fe41-4e76-8019-c7a9761fa215 tempest-ServersNegativeTestMultiTenantJSON-1592777577 tempest-ServersNegativeTestMultiTenantJSON-1592777577-project-member] Lock "736db39c-e5e5-4a54-b85a-aa5c703f432e-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2152.928281] env[68906]: INFO nova.compute.manager [None req-04b74860-fe41-4e76-8019-c7a9761fa215 tempest-ServersNegativeTestMultiTenantJSON-1592777577 tempest-ServersNegativeTestMultiTenantJSON-1592777577-project-member] [instance: 736db39c-e5e5-4a54-b85a-aa5c703f432e] Terminating instance [ 2152.929870] env[68906]: DEBUG nova.compute.manager [None req-04b74860-fe41-4e76-8019-c7a9761fa215 tempest-ServersNegativeTestMultiTenantJSON-1592777577 tempest-ServersNegativeTestMultiTenantJSON-1592777577-project-member] [instance: 736db39c-e5e5-4a54-b85a-aa5c703f432e] Start destroying the instance on the hypervisor. {{(pid=68906) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2152.930083] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-04b74860-fe41-4e76-8019-c7a9761fa215 tempest-ServersNegativeTestMultiTenantJSON-1592777577 tempest-ServersNegativeTestMultiTenantJSON-1592777577-project-member] [instance: 736db39c-e5e5-4a54-b85a-aa5c703f432e] Destroying instance {{(pid=68906) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2152.930541] env[68906]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-00b6dcba-f27a-424f-9c41-12d342592d6f {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2152.940749] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0aea256c-cabf-4e68-9311-d41b520e3217 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2152.951021] env[68906]: DEBUG nova.compute.manager [None req-abb6e557-a4c5-4fae-95f1-d08eed20205b tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] [instance: a4f1c6a3-c189-4e3a-8ac9-ac6ec8b95723] Starting instance... {{(pid=68906) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 2152.972562] env[68906]: WARNING nova.virt.vmwareapi.vmops [None req-04b74860-fe41-4e76-8019-c7a9761fa215 tempest-ServersNegativeTestMultiTenantJSON-1592777577 tempest-ServersNegativeTestMultiTenantJSON-1592777577-project-member] [instance: 736db39c-e5e5-4a54-b85a-aa5c703f432e] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 736db39c-e5e5-4a54-b85a-aa5c703f432e could not be found.
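The lockutils lines above record a full lock lifecycle: "Acquiring" when a caller queues up, "acquired ... waited Ns" once it gets the lock, and '"released" ... held Ns' on exit (here a terminate waited 474.616s for the build lock held 671.174s). A self-contained sketch that reproduces this accounting follows; it is not the oslo.concurrency implementation, and tracked_lock is a hypothetical helper.

```python
# Sketch of the waited/held accounting printed by oslo_concurrency.lockutils
# above. Not the real library; tracked_lock is a hypothetical helper.
import threading
import time
from contextlib import contextmanager

_locks = {}  # one named lock per resource, e.g. per instance UUID

@contextmanager
def tracked_lock(name, owner):
    lock = _locks.setdefault(name, threading.Lock())
    t0 = time.monotonic()
    print(f'Acquiring lock "{name}" by "{owner}"')
    lock.acquire()
    waited = time.monotonic() - t0
    print(f'Lock "{name}" acquired by "{owner}" :: waited {waited:.3f}s')
    t1 = time.monotonic()
    try:
        yield
    finally:
        lock.release()
        held = time.monotonic() - t1
        print(f'Lock "{name}" "released" by "{owner}" :: held {held:.3f}s')

# Usage, mirroring the terminate path above:
# with tracked_lock("736db39c-...-events", "_clear_events"):
#     ...  # clear per-instance events
```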
[ 2152.972752] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-04b74860-fe41-4e76-8019-c7a9761fa215 tempest-ServersNegativeTestMultiTenantJSON-1592777577 tempest-ServersNegativeTestMultiTenantJSON-1592777577-project-member] [instance: 736db39c-e5e5-4a54-b85a-aa5c703f432e] Instance destroyed {{(pid=68906) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2152.972927] env[68906]: INFO nova.compute.manager [None req-04b74860-fe41-4e76-8019-c7a9761fa215 tempest-ServersNegativeTestMultiTenantJSON-1592777577 tempest-ServersNegativeTestMultiTenantJSON-1592777577-project-member] [instance: 736db39c-e5e5-4a54-b85a-aa5c703f432e] Took 0.04 seconds to destroy the instance on the hypervisor. [ 2152.973191] env[68906]: DEBUG oslo.service.loopingcall [None req-04b74860-fe41-4e76-8019-c7a9761fa215 tempest-ServersNegativeTestMultiTenantJSON-1592777577 tempest-ServersNegativeTestMultiTenantJSON-1592777577-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=68906) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2152.973417] env[68906]: DEBUG nova.compute.manager [-] [instance: 736db39c-e5e5-4a54-b85a-aa5c703f432e] Deallocating network for instance {{(pid=68906) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 2152.973514] env[68906]: DEBUG nova.network.neutron [-] [instance: 736db39c-e5e5-4a54-b85a-aa5c703f432e] deallocate_for_instance() {{(pid=68906) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2152.998191] env[68906]: DEBUG oslo_concurrency.lockutils [None req-abb6e557-a4c5-4fae-95f1-d08eed20205b tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2152.998430] env[68906]: DEBUG oslo_concurrency.lockutils [None req-abb6e557-a4c5-4fae-95f1-d08eed20205b tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2152.999815] env[68906]: INFO nova.compute.claims [None req-abb6e557-a4c5-4fae-95f1-d08eed20205b tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] [instance: a4f1c6a3-c189-4e3a-8ac9-ac6ec8b95723] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 2153.003231] env[68906]: DEBUG nova.network.neutron [-] [instance: 736db39c-e5e5-4a54-b85a-aa5c703f432e] Updating instance_info_cache with network_info: [] {{(pid=68906) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2153.011092] env[68906]: INFO nova.compute.manager [-] [instance: 736db39c-e5e5-4a54-b85a-aa5c703f432e] Took 0.04 seconds to deallocate network for instance.
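For the "Claim successful" line above to happen, the requested resources must fit the provider's inventory, reported several times in this log and again just below. Placement-style schedulable capacity per resource class works out to (total - reserved) * allocation_ratio, with max_unit capping any single allocation. A small sketch using this node's logged values, as an illustration rather than the placement service's actual code:

```python
# Capacity math behind the "Claim successful" line, using the inventory
# logged for provider 1119f6db-bfd7-4ef3-bdff-5c6974dc249b. Sketch only;
# the placement service does this accounting server-side.
inventory = {
    'VCPU':      {'total': 48,     'reserved': 0,   'allocation_ratio': 4.0, 'max_unit': 16},
    'MEMORY_MB': {'total': 196590, 'reserved': 512, 'allocation_ratio': 1.0, 'max_unit': 65530},
    'DISK_GB':   {'total': 400,    'reserved': 0,   'allocation_ratio': 1.0, 'max_unit': 93},
}

def capacity(inv):
    # Effective schedulable capacity for one resource class.
    return (inv['total'] - inv['reserved']) * inv['allocation_ratio']

for rc, inv in inventory.items():
    print(f"{rc}: capacity={capacity(inv):.0f}, max per allocation={inv['max_unit']}")
# VCPU: 192, MEMORY_MB: 196078, DISK_GB: 400 -- so the m1.nano claims seen
# here (1 VCPU / 128 MB / 1 GB each) fit comfortably.
```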
[ 2153.119130] env[68906]: DEBUG oslo_concurrency.lockutils [None req-04b74860-fe41-4e76-8019-c7a9761fa215 tempest-ServersNegativeTestMultiTenantJSON-1592777577 tempest-ServersNegativeTestMultiTenantJSON-1592777577-project-member] Lock "736db39c-e5e5-4a54-b85a-aa5c703f432e" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.193s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2153.120353] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Lock "736db39c-e5e5-4a54-b85a-aa5c703f432e" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 329.949s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2153.120659] env[68906]: INFO nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: 736db39c-e5e5-4a54-b85a-aa5c703f432e] During sync_power_state the instance has a pending task (deleting). Skip. [ 2153.120846] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Lock "736db39c-e5e5-4a54-b85a-aa5c703f432e" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.001s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2153.264796] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-669f40c4-01bc-4116-ab64-0eb455e5810a {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2153.272773] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ed6311bf-b52c-4cef-91b9-6d3be253b6b6 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2153.302172] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-233c39fb-5892-4318-8657-90bbd26ac6d4 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2153.309414] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2bd3a717-4108-4b2d-a695-4b914e39c5d4 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2153.322147] env[68906]: DEBUG nova.compute.provider_tree [None req-abb6e557-a4c5-4fae-95f1-d08eed20205b tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] Inventory has not changed in ProviderTree for provider: 1119f6db-bfd7-4ef3-bdff-5c6974dc249b {{(pid=68906) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2153.330659] env[68906]: DEBUG nova.scheduler.client.report [None req-abb6e557-a4c5-4fae-95f1-d08eed20205b tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] Inventory has not changed for provider 1119f6db-bfd7-4ef3-bdff-5c6974dc249b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0,
'min_unit': 1, 'max_unit': 93, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68906) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2153.345605] env[68906]: DEBUG oslo_concurrency.lockutils [None req-abb6e557-a4c5-4fae-95f1-d08eed20205b tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.347s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2153.346135] env[68906]: DEBUG nova.compute.manager [None req-abb6e557-a4c5-4fae-95f1-d08eed20205b tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] [instance: a4f1c6a3-c189-4e3a-8ac9-ac6ec8b95723] Start building networks asynchronously for instance. {{(pid=68906) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 2153.380530] env[68906]: DEBUG nova.compute.utils [None req-abb6e557-a4c5-4fae-95f1-d08eed20205b tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] Using /dev/sd instead of None {{(pid=68906) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 2153.383185] env[68906]: DEBUG nova.compute.manager [None req-abb6e557-a4c5-4fae-95f1-d08eed20205b tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] [instance: a4f1c6a3-c189-4e3a-8ac9-ac6ec8b95723] Allocating IP information in the background. {{(pid=68906) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 2153.383355] env[68906]: DEBUG nova.network.neutron [None req-abb6e557-a4c5-4fae-95f1-d08eed20205b tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] [instance: a4f1c6a3-c189-4e3a-8ac9-ac6ec8b95723] allocate_for_instance() {{(pid=68906) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 2153.393026] env[68906]: DEBUG nova.compute.manager [None req-abb6e557-a4c5-4fae-95f1-d08eed20205b tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] [instance: a4f1c6a3-c189-4e3a-8ac9-ac6ec8b95723] Start building block device mappings for instance. {{(pid=68906) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 2153.439355] env[68906]: DEBUG nova.policy [None req-abb6e557-a4c5-4fae-95f1-d08eed20205b tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'fa8acbdb3f304f67ba13b02e547844d1', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '35ea959a162d451db5103b94bf7da26a', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68906) authorize /opt/stack/nova/nova/policy.py:203}} [ 2153.458320] env[68906]: DEBUG nova.compute.manager [None req-abb6e557-a4c5-4fae-95f1-d08eed20205b tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] [instance: a4f1c6a3-c189-4e3a-8ac9-ac6ec8b95723] Start spawning the instance on the hypervisor. 
{{(pid=68906) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 2153.484176] env[68906]: DEBUG nova.virt.hardware [None req-abb6e557-a4c5-4fae-95f1-d08eed20205b tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-17T13:00:39Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-17T13:00:23Z,direct_url=,disk_format='vmdk',id=b1400c31-d33b-4e13-944f-4c645e62493e,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='1ae7bf3a375d41c6af5e7536af51ffd1',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-17T13:00:24Z,virtual_size=,visibility=), allow threads: False {{(pid=68906) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 2153.484422] env[68906]: DEBUG nova.virt.hardware [None req-abb6e557-a4c5-4fae-95f1-d08eed20205b tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] Flavor limits 0:0:0 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 2153.484579] env[68906]: DEBUG nova.virt.hardware [None req-abb6e557-a4c5-4fae-95f1-d08eed20205b tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] Image limits 0:0:0 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 2153.484758] env[68906]: DEBUG nova.virt.hardware [None req-abb6e557-a4c5-4fae-95f1-d08eed20205b tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] Flavor pref 0:0:0 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 2153.484906] env[68906]: DEBUG nova.virt.hardware [None req-abb6e557-a4c5-4fae-95f1-d08eed20205b tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] Image pref 0:0:0 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 2153.485085] env[68906]: DEBUG nova.virt.hardware [None req-abb6e557-a4c5-4fae-95f1-d08eed20205b tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 2153.485305] env[68906]: DEBUG nova.virt.hardware [None req-abb6e557-a4c5-4fae-95f1-d08eed20205b tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68906) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 2153.485494] env[68906]: DEBUG nova.virt.hardware [None req-abb6e557-a4c5-4fae-95f1-d08eed20205b tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68906) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 2153.485682] env[68906]: DEBUG nova.virt.hardware [None req-abb6e557-a4c5-4fae-95f1-d08eed20205b tempest-ImagesTestJSON-1546870080 
tempest-ImagesTestJSON-1546870080-project-member] Got 1 possible topologies {{(pid=68906) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 2153.485850] env[68906]: DEBUG nova.virt.hardware [None req-abb6e557-a4c5-4fae-95f1-d08eed20205b tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68906) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 2153.486038] env[68906]: DEBUG nova.virt.hardware [None req-abb6e557-a4c5-4fae-95f1-d08eed20205b tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68906) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 2153.486878] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-aca68183-b42f-40dd-8b4a-d4e7fccc8300 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2153.494895] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f79ce107-8522-4efe-a720-b0241be4cd6e {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2153.737701] env[68906]: DEBUG nova.network.neutron [None req-abb6e557-a4c5-4fae-95f1-d08eed20205b tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] [instance: a4f1c6a3-c189-4e3a-8ac9-ac6ec8b95723] Successfully created port: 90f7cc48-3d56-4176-9d01-4ac93ea335e7 {{(pid=68906) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 2154.275062] env[68906]: DEBUG nova.network.neutron [None req-abb6e557-a4c5-4fae-95f1-d08eed20205b tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] [instance: a4f1c6a3-c189-4e3a-8ac9-ac6ec8b95723] Successfully updated port: 90f7cc48-3d56-4176-9d01-4ac93ea335e7 {{(pid=68906) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 2154.286446] env[68906]: DEBUG oslo_concurrency.lockutils [None req-abb6e557-a4c5-4fae-95f1-d08eed20205b tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] Acquiring lock "refresh_cache-a4f1c6a3-c189-4e3a-8ac9-ac6ec8b95723" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2154.286591] env[68906]: DEBUG oslo_concurrency.lockutils [None req-abb6e557-a4c5-4fae-95f1-d08eed20205b tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] Acquired lock "refresh_cache-a4f1c6a3-c189-4e3a-8ac9-ac6ec8b95723" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2154.286737] env[68906]: DEBUG nova.network.neutron [None req-abb6e557-a4c5-4fae-95f1-d08eed20205b tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] [instance: a4f1c6a3-c189-4e3a-8ac9-ac6ec8b95723] Building network info cache for instance {{(pid=68906) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 2154.325673] env[68906]: DEBUG nova.network.neutron [None req-abb6e557-a4c5-4fae-95f1-d08eed20205b tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] [instance: a4f1c6a3-c189-4e3a-8ac9-ac6ec8b95723] Instance cache missing network info. 
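The topology search above is easy to reproduce: with no flavor or image constraints the limits default to 65536 sockets/cores/threads, and for the 1-vCPU m1.nano flavor only one split survives. A simplified standalone model (the real nova.virt.hardware code also applies preference ordering):

    from collections import namedtuple

    VirtCPUTopology = namedtuple('VirtCPUTopology', 'sockets cores threads')

    def possible_topologies(vcpus, maximum):
        # Enumerate every (sockets, cores, threads) split whose product
        # equals the vCPU count and respects the per-dimension maxima.
        return [VirtCPUTopology(s, c, t)
                for s in range(1, min(vcpus, maximum.sockets) + 1)
                for c in range(1, min(vcpus, maximum.cores) + 1)
                for t in range(1, min(vcpus, maximum.threads) + 1)
                if s * c * t == vcpus]

    # Matches "Got 1 possible topologies" above:
    print(possible_topologies(1, VirtCPUTopology(65536, 65536, 65536)))
    # [VirtCPUTopology(sockets=1, cores=1, threads=1)]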
{{(pid=68906) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 2154.482170] env[68906]: DEBUG nova.network.neutron [None req-abb6e557-a4c5-4fae-95f1-d08eed20205b tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] [instance: a4f1c6a3-c189-4e3a-8ac9-ac6ec8b95723] Updating instance_info_cache with network_info: [{"id": "90f7cc48-3d56-4176-9d01-4ac93ea335e7", "address": "fa:16:3e:ca:ca:86", "network": {"id": "42998f86-911a-4af7-93b7-ffe19e2cd70c", "bridge": "br-int", "label": "tempest-ImagesTestJSON-563323785-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "35ea959a162d451db5103b94bf7da26a", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ea4fe416-47a6-4542-b59d-8c71ab4d6503", "external-id": "nsx-vlan-transportzone-369", "segmentation_id": 369, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap90f7cc48-3d", "ovs_interfaceid": "90f7cc48-3d56-4176-9d01-4ac93ea335e7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68906) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2154.494946] env[68906]: DEBUG oslo_concurrency.lockutils [None req-abb6e557-a4c5-4fae-95f1-d08eed20205b tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] Releasing lock "refresh_cache-a4f1c6a3-c189-4e3a-8ac9-ac6ec8b95723" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2154.495313] env[68906]: DEBUG nova.compute.manager [None req-abb6e557-a4c5-4fae-95f1-d08eed20205b tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] [instance: a4f1c6a3-c189-4e3a-8ac9-ac6ec8b95723] Instance network_info: |[{"id": "90f7cc48-3d56-4176-9d01-4ac93ea335e7", "address": "fa:16:3e:ca:ca:86", "network": {"id": "42998f86-911a-4af7-93b7-ffe19e2cd70c", "bridge": "br-int", "label": "tempest-ImagesTestJSON-563323785-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "35ea959a162d451db5103b94bf7da26a", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ea4fe416-47a6-4542-b59d-8c71ab4d6503", "external-id": "nsx-vlan-transportzone-369", "segmentation_id": 369, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap90f7cc48-3d", "ovs_interfaceid": "90f7cc48-3d56-4176-9d01-4ac93ea335e7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=68906) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 2154.495815] env[68906]: DEBUG 
nova.virt.vmwareapi.vmops [None req-abb6e557-a4c5-4fae-95f1-d08eed20205b tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] [instance: a4f1c6a3-c189-4e3a-8ac9-ac6ec8b95723] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:ca:ca:86', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'ea4fe416-47a6-4542-b59d-8c71ab4d6503', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '90f7cc48-3d56-4176-9d01-4ac93ea335e7', 'vif_model': 'vmxnet3'}] {{(pid=68906) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 2154.504522] env[68906]: DEBUG oslo.service.loopingcall [None req-abb6e557-a4c5-4fae-95f1-d08eed20205b tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=68906) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2154.505097] env[68906]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: a4f1c6a3-c189-4e3a-8ac9-ac6ec8b95723] Creating VM on the ESX host {{(pid=68906) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 2154.505366] env[68906]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-251c79a0-b5c0-4398-b6c6-167b0c60f4c4 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2154.527105] env[68906]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 2154.527105] env[68906]: value = "task-3475469" [ 2154.527105] env[68906]: _type = "Task" [ 2154.527105] env[68906]: } to complete. {{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2154.535049] env[68906]: DEBUG oslo_vmware.api [-] Task: {'id': task-3475469, 'name': CreateVM_Task} progress is 0%. 
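CreateVM_Task returns a task reference (task-3475469) that the session then polls until vCenter reports a terminal state; each "progress is 0%." record above is one poll iteration. A simplified stand-in for the oslo.vmware polling loop, where get_task_info is a placeholder for the vSphere TaskInfo fetch:

    import time

    def wait_for_task(get_task_info, poll_interval=0.5):
        while True:
            info = get_task_info()   # one PropertyCollector round trip
            if info['state'] == 'success':
                return info.get('result')
            if info['state'] == 'error':
                raise RuntimeError(info['error'])
            time.sleep(poll_interval)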
{{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2154.839498] env[68906]: DEBUG nova.compute.manager [req-be080ed5-6b4b-4a0d-83dd-2314db5a62c6 req-0c4449fc-b010-4a9b-a691-b278f5cf4d94 service nova] [instance: a4f1c6a3-c189-4e3a-8ac9-ac6ec8b95723] Received event network-vif-plugged-90f7cc48-3d56-4176-9d01-4ac93ea335e7 {{(pid=68906) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 2154.839779] env[68906]: DEBUG oslo_concurrency.lockutils [req-be080ed5-6b4b-4a0d-83dd-2314db5a62c6 req-0c4449fc-b010-4a9b-a691-b278f5cf4d94 service nova] Acquiring lock "a4f1c6a3-c189-4e3a-8ac9-ac6ec8b95723-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2154.840009] env[68906]: DEBUG oslo_concurrency.lockutils [req-be080ed5-6b4b-4a0d-83dd-2314db5a62c6 req-0c4449fc-b010-4a9b-a691-b278f5cf4d94 service nova] Lock "a4f1c6a3-c189-4e3a-8ac9-ac6ec8b95723-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2154.840188] env[68906]: DEBUG oslo_concurrency.lockutils [req-be080ed5-6b4b-4a0d-83dd-2314db5a62c6 req-0c4449fc-b010-4a9b-a691-b278f5cf4d94 service nova] Lock "a4f1c6a3-c189-4e3a-8ac9-ac6ec8b95723-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2154.840356] env[68906]: DEBUG nova.compute.manager [req-be080ed5-6b4b-4a0d-83dd-2314db5a62c6 req-0c4449fc-b010-4a9b-a691-b278f5cf4d94 service nova] [instance: a4f1c6a3-c189-4e3a-8ac9-ac6ec8b95723] No waiting events found dispatching network-vif-plugged-90f7cc48-3d56-4176-9d01-4ac93ea335e7 {{(pid=68906) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 2154.840529] env[68906]: WARNING nova.compute.manager [req-be080ed5-6b4b-4a0d-83dd-2314db5a62c6 req-0c4449fc-b010-4a9b-a691-b278f5cf4d94 service nova] [instance: a4f1c6a3-c189-4e3a-8ac9-ac6ec8b95723] Received unexpected event network-vif-plugged-90f7cc48-3d56-4176-9d01-4ac93ea335e7 for instance with vm_state building and task_state spawning. [ 2154.840683] env[68906]: DEBUG nova.compute.manager [req-be080ed5-6b4b-4a0d-83dd-2314db5a62c6 req-0c4449fc-b010-4a9b-a691-b278f5cf4d94 service nova] [instance: a4f1c6a3-c189-4e3a-8ac9-ac6ec8b95723] Received event network-changed-90f7cc48-3d56-4176-9d01-4ac93ea335e7 {{(pid=68906) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 2154.840834] env[68906]: DEBUG nova.compute.manager [req-be080ed5-6b4b-4a0d-83dd-2314db5a62c6 req-0c4449fc-b010-4a9b-a691-b278f5cf4d94 service nova] [instance: a4f1c6a3-c189-4e3a-8ac9-ac6ec8b95723] Refreshing instance network info cache due to event network-changed-90f7cc48-3d56-4176-9d01-4ac93ea335e7. 
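The "No waiting events found dispatching network-vif-plugged-..." WARNING above is the miss path of the instance-event registry: a spawn that intends to wait for plug events registers a waiter first, and Neutron-driven notifications pop it. A toy model of that registry (the real one lives in nova.compute.manager.InstanceEvents):

    import threading

    class InstanceEvents:
        def __init__(self):
            self._events = {}   # (instance_uuid, event_name) -> Event
            self._lock = threading.Lock()

        def prepare(self, uuid, name):
            # Called by the spawn path before it blocks on the event.
            with self._lock:
                ev = self._events[(uuid, name)] = threading.Event()
            return ev

        def pop(self, uuid, name):
            # Called when an external event arrives from Neutron.
            with self._lock:
                return self._events.pop((uuid, name), None)

    events = InstanceEvents()
    waiter = events.pop('a4f1c6a3-c189-4e3a-8ac9-ac6ec8b95723',
                        'network-vif-plugged-90f7cc48-3d56-4176-9d01-4ac93ea335e7')
    if waiter is None:
        print('Received unexpected event')   # the WARNING branch above
    else:
        waiter.set()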
{{(pid=68906) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 2154.841034] env[68906]: DEBUG oslo_concurrency.lockutils [req-be080ed5-6b4b-4a0d-83dd-2314db5a62c6 req-0c4449fc-b010-4a9b-a691-b278f5cf4d94 service nova] Acquiring lock "refresh_cache-a4f1c6a3-c189-4e3a-8ac9-ac6ec8b95723" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2154.841204] env[68906]: DEBUG oslo_concurrency.lockutils [req-be080ed5-6b4b-4a0d-83dd-2314db5a62c6 req-0c4449fc-b010-4a9b-a691-b278f5cf4d94 service nova] Acquired lock "refresh_cache-a4f1c6a3-c189-4e3a-8ac9-ac6ec8b95723" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2154.841367] env[68906]: DEBUG nova.network.neutron [req-be080ed5-6b4b-4a0d-83dd-2314db5a62c6 req-0c4449fc-b010-4a9b-a691-b278f5cf4d94 service nova] [instance: a4f1c6a3-c189-4e3a-8ac9-ac6ec8b95723] Refreshing network info cache for port 90f7cc48-3d56-4176-9d01-4ac93ea335e7 {{(pid=68906) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 2155.039069] env[68906]: DEBUG oslo_vmware.api [-] Task: {'id': task-3475469, 'name': CreateVM_Task, 'duration_secs': 0.292497} completed successfully. {{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2155.039260] env[68906]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: a4f1c6a3-c189-4e3a-8ac9-ac6ec8b95723] Created VM on the ESX host {{(pid=68906) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 2155.039861] env[68906]: DEBUG oslo_concurrency.lockutils [None req-abb6e557-a4c5-4fae-95f1-d08eed20205b tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2155.040049] env[68906]: DEBUG oslo_concurrency.lockutils [None req-abb6e557-a4c5-4fae-95f1-d08eed20205b tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] Acquired lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2155.040382] env[68906]: DEBUG oslo_concurrency.lockutils [None req-abb6e557-a4c5-4fae-95f1-d08eed20205b tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 2155.040634] env[68906]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-8dccc410-db12-45fd-afb1-4ff5c1b6e84b {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2155.044967] env[68906]: DEBUG oslo_vmware.api [None req-abb6e557-a4c5-4fae-95f1-d08eed20205b tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] Waiting for the task: (returnval){ [ 2155.044967] env[68906]: value = "session[52a3cb0d-1212-490c-2669-91043b4da4d8]522b9808-bddd-2170-c4e9-639aff9e0810" [ 2155.044967] env[68906]: _type = "Task" [ 2155.044967] env[68906]: } to complete. 
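The per-image lock on "[datastore2] devstack-image-cache_base/b1400c31-..." plus the SearchDatastore_Task that follows implement the image-cache check: probe for the cached VMDK and download only on a miss. In outline (every parameter in this sketch is a placeholder):

    def fetch_image_if_missing(lock, cached_vmdk_exists, fetch_image):
        # Serialize per image so concurrent builds of the same image
        # don't race, probe the datastore (SearchDatastore_Task), and
        # fetch from Glance only when the probe misses.
        with lock:
            if not cached_vmdk_exists():
                fetch_image()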
{{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2155.052101] env[68906]: DEBUG oslo_vmware.api [None req-abb6e557-a4c5-4fae-95f1-d08eed20205b tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] Task: {'id': session[52a3cb0d-1212-490c-2669-91043b4da4d8]522b9808-bddd-2170-c4e9-639aff9e0810, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2155.076089] env[68906]: DEBUG nova.network.neutron [req-be080ed5-6b4b-4a0d-83dd-2314db5a62c6 req-0c4449fc-b010-4a9b-a691-b278f5cf4d94 service nova] [instance: a4f1c6a3-c189-4e3a-8ac9-ac6ec8b95723] Updated VIF entry in instance network info cache for port 90f7cc48-3d56-4176-9d01-4ac93ea335e7. {{(pid=68906) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 2155.076422] env[68906]: DEBUG nova.network.neutron [req-be080ed5-6b4b-4a0d-83dd-2314db5a62c6 req-0c4449fc-b010-4a9b-a691-b278f5cf4d94 service nova] [instance: a4f1c6a3-c189-4e3a-8ac9-ac6ec8b95723] Updating instance_info_cache with network_info: [{"id": "90f7cc48-3d56-4176-9d01-4ac93ea335e7", "address": "fa:16:3e:ca:ca:86", "network": {"id": "42998f86-911a-4af7-93b7-ffe19e2cd70c", "bridge": "br-int", "label": "tempest-ImagesTestJSON-563323785-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "35ea959a162d451db5103b94bf7da26a", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ea4fe416-47a6-4542-b59d-8c71ab4d6503", "external-id": "nsx-vlan-transportzone-369", "segmentation_id": 369, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap90f7cc48-3d", "ovs_interfaceid": "90f7cc48-3d56-4176-9d01-4ac93ea335e7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68906) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2155.085336] env[68906]: DEBUG oslo_concurrency.lockutils [req-be080ed5-6b4b-4a0d-83dd-2314db5a62c6 req-0c4449fc-b010-4a9b-a691-b278f5cf4d94 service nova] Releasing lock "refresh_cache-a4f1c6a3-c189-4e3a-8ac9-ac6ec8b95723" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2155.140760] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2155.140924] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Starting heal instance info cache {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 2155.141078] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Rebuilding the list of instances to heal {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 2155.162126] env[68906]: 
DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: ce6e5cd6-efb8-46d1-811d-74c084661cce] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2155.162284] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: 7994d291-b4bf-48f5-ad34-c1f484d77f6e] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2155.162417] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: 860248ea-e77b-4ff6-af64-b75f88a31348] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2155.162543] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: 3cfde5a7-3148-426c-8867-ffafb33dc95b] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2155.162668] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: 01b79dfa-cd20-495d-b112-8429c28b741e] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2155.162792] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: 8bfc91d4-b1d7-449a-8d48-0e63490fe663] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2155.162914] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: d70b039d-c8ad-4ffd-84f8-08f17cb97578] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2155.163045] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: ed276c3c-6085-427d-b3b7-86bbb8660dbc] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2155.163168] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: cd208e67-55a3-4c0b-ad49-abd3a700d5ef] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2155.163286] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: a4f1c6a3-c189-4e3a-8ac9-ac6ec8b95723] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2155.163406] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Didn't find any instances for network info cache update. 
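All ten candidates are rejected for the same reason, so the periodic pass ends with "Didn't find any instances for network info cache update." The filter amounts to this sketch: instances still being built have no settled network info worth refreshing.

    def instances_to_heal(instances):
        for inst in instances:
            if inst['vm_state'] == 'building':
                # "Skipping network cache update for instance because
                # it is Building."
                continue
            yield inst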
{{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 2155.163834] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2155.555874] env[68906]: DEBUG oslo_concurrency.lockutils [None req-abb6e557-a4c5-4fae-95f1-d08eed20205b tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] Releasing lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2155.556200] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-abb6e557-a4c5-4fae-95f1-d08eed20205b tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] [instance: a4f1c6a3-c189-4e3a-8ac9-ac6ec8b95723] Processing image b1400c31-d33b-4e13-944f-4c645e62493e {{(pid=68906) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 2155.556361] env[68906]: DEBUG oslo_concurrency.lockutils [None req-abb6e557-a4c5-4fae-95f1-d08eed20205b tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e/b1400c31-d33b-4e13-944f-4c645e62493e.vmdk" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2157.140573] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2157.140883] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2159.141395] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2162.135431] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2162.157876] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2162.158085] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=68906) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 2167.142729] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2167.142729] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager.update_available_resource {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2167.153236] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2167.153640] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2167.153970] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2167.154247] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=68906) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 2167.155522] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-690fc13d-b15f-4192-95b5-a5d0d0468146 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2167.164616] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fe84c08d-4a1b-4779-8efc-38b3ddcaf02e {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2167.179281] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c0c2aca5-bb11-4da6-b7aa-92bc613faa92 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2167.185658] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f0cc7272-35c7-44bf-be46-9361a0ed2d8f {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2167.214623] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180953MB free_disk=93GB free_vcpus=48 pci_devices=None {{(pid=68906) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 2167.214784] env[68906]: DEBUG 
oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2167.214983] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2167.298301] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance ce6e5cd6-efb8-46d1-811d-74c084661cce actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2167.298301] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 7994d291-b4bf-48f5-ad34-c1f484d77f6e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2167.298301] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 860248ea-e77b-4ff6-af64-b75f88a31348 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2167.298301] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 3cfde5a7-3148-426c-8867-ffafb33dc95b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2167.298503] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 01b79dfa-cd20-495d-b112-8429c28b741e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2167.298503] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 8bfc91d4-b1d7-449a-8d48-0e63490fe663 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2167.298643] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance d70b039d-c8ad-4ffd-84f8-08f17cb97578 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
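The per-instance placement allocations above roll up into the "Final resource view" record a little further down; the numbers check out against the 512 MB host memory reservation shown in the inventory data:

    RESERVED_HOST_MEMORY_MB = 512   # the MEMORY_MB 'reserved' value

    allocations = [{'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}] * 10

    used_vcpus = sum(a['VCPU'] for a in allocations)                  # 10
    used_ram = RESERVED_HOST_MEMORY_MB + sum(a['MEMORY_MB']
                                             for a in allocations)    # 1792
    used_disk = sum(a['DISK_GB'] for a in allocations)                # 10
    print(used_vcpus, used_ram, used_disk)  # 10 1792 10, as reported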
{{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2167.298722] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance ed276c3c-6085-427d-b3b7-86bbb8660dbc actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2167.298835] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance cd208e67-55a3-4c0b-ad49-abd3a700d5ef actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2167.298944] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance a4f1c6a3-c189-4e3a-8ac9-ac6ec8b95723 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2167.299152] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=68906) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 2167.299290] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=68906) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 2167.423245] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-161246db-5351-4666-990d-dbb5f96db0e6 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2167.431284] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8bb24303-09a9-4244-bf15-1d744b997310 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2167.460899] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cf67bd1f-4e79-43ba-8d0f-aaaedddae23a {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2167.468192] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ece4d41c-ec48-48fe-a5f9-bacd3a787413 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2167.481011] env[68906]: DEBUG nova.compute.provider_tree [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Inventory has not changed in ProviderTree for provider: 1119f6db-bfd7-4ef3-bdff-5c6974dc249b {{(pid=68906) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2167.501049] env[68906]: DEBUG nova.scheduler.client.report [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Inventory has not changed for provider 
1119f6db-bfd7-4ef3-bdff-5c6974dc249b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 93, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68906) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2167.521357] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=68906) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 2167.521555] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.307s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2202.285181] env[68906]: WARNING oslo_vmware.rw_handles [None req-8706cf73-54ea-418d-91c3-3d08fba88ac5 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 2202.285181] env[68906]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 2202.285181] env[68906]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 2202.285181] env[68906]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 2202.285181] env[68906]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 2202.285181] env[68906]: ERROR oslo_vmware.rw_handles response.begin() [ 2202.285181] env[68906]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 2202.285181] env[68906]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 2202.285181] env[68906]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 2202.285181] env[68906]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 2202.285181] env[68906]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 2202.285181] env[68906]: ERROR oslo_vmware.rw_handles [ 2202.285871] env[68906]: DEBUG nova.virt.vmwareapi.images [None req-8706cf73-54ea-418d-91c3-3d08fba88ac5 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] [instance: ce6e5cd6-efb8-46d1-811d-74c084661cce] Downloaded image file data b1400c31-d33b-4e13-944f-4c645e62493e to vmware_temp/ffed3f6c-acd2-4e7b-9285-24e58baa3033/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk on the data store datastore2 {{(pid=68906) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 2202.287768] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-8706cf73-54ea-418d-91c3-3d08fba88ac5 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] [instance: ce6e5cd6-efb8-46d1-811d-74c084661cce] Caching image {{(pid=68906) _fetch_image_if_missing 
/opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 2202.288018] env[68906]: DEBUG nova.virt.vmwareapi.vm_util [None req-8706cf73-54ea-418d-91c3-3d08fba88ac5 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Copying Virtual Disk [datastore2] vmware_temp/ffed3f6c-acd2-4e7b-9285-24e58baa3033/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk to [datastore2] vmware_temp/ffed3f6c-acd2-4e7b-9285-24e58baa3033/b1400c31-d33b-4e13-944f-4c645e62493e/b1400c31-d33b-4e13-944f-4c645e62493e.vmdk {{(pid=68906) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 2202.288315] env[68906]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-b7b066da-ab06-43b4-9cec-03099d00be1d {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2202.295785] env[68906]: DEBUG oslo_vmware.api [None req-8706cf73-54ea-418d-91c3-3d08fba88ac5 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Waiting for the task: (returnval){ [ 2202.295785] env[68906]: value = "task-3475470" [ 2202.295785] env[68906]: _type = "Task" [ 2202.295785] env[68906]: } to complete. {{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2202.304811] env[68906]: DEBUG oslo_vmware.api [None req-8706cf73-54ea-418d-91c3-3d08fba88ac5 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Task: {'id': task-3475470, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2202.805728] env[68906]: DEBUG oslo_vmware.exceptions [None req-8706cf73-54ea-418d-91c3-3d08fba88ac5 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Fault InvalidArgument not matched. 
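"Fault InvalidArgument not matched." means oslo.vmware could not map the server fault to a more specific exception class, so the copy failure surfaces as a generic VimFaultException carrying the fault name in fault_list. A sketch of the failure path the traceback below walks through (session and the copy task are placeholders):

    from oslo_vmware import exceptions as vexc

    def cache_sparse_image(session, vmdk_copy_task):
        try:
            session.wait_for_task(vmdk_copy_task)
        except vexc.VimFaultException as e:
            if 'InvalidArgument' in e.fault_list:
                # "A specified parameter was not correct: fileType"
                raise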
{{(pid=68906) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 2202.805999] env[68906]: DEBUG oslo_concurrency.lockutils [None req-8706cf73-54ea-418d-91c3-3d08fba88ac5 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Releasing lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e/b1400c31-d33b-4e13-944f-4c645e62493e.vmdk" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2202.806587] env[68906]: ERROR nova.compute.manager [None req-8706cf73-54ea-418d-91c3-3d08fba88ac5 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] [instance: ce6e5cd6-efb8-46d1-811d-74c084661cce] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2202.806587] env[68906]: Faults: ['InvalidArgument'] [ 2202.806587] env[68906]: ERROR nova.compute.manager [instance: ce6e5cd6-efb8-46d1-811d-74c084661cce] Traceback (most recent call last): [ 2202.806587] env[68906]: ERROR nova.compute.manager [instance: ce6e5cd6-efb8-46d1-811d-74c084661cce] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 2202.806587] env[68906]: ERROR nova.compute.manager [instance: ce6e5cd6-efb8-46d1-811d-74c084661cce] yield resources [ 2202.806587] env[68906]: ERROR nova.compute.manager [instance: ce6e5cd6-efb8-46d1-811d-74c084661cce] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 2202.806587] env[68906]: ERROR nova.compute.manager [instance: ce6e5cd6-efb8-46d1-811d-74c084661cce] self.driver.spawn(context, instance, image_meta, [ 2202.806587] env[68906]: ERROR nova.compute.manager [instance: ce6e5cd6-efb8-46d1-811d-74c084661cce] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2202.806587] env[68906]: ERROR nova.compute.manager [instance: ce6e5cd6-efb8-46d1-811d-74c084661cce] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2202.806587] env[68906]: ERROR nova.compute.manager [instance: ce6e5cd6-efb8-46d1-811d-74c084661cce] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2202.806587] env[68906]: ERROR nova.compute.manager [instance: ce6e5cd6-efb8-46d1-811d-74c084661cce] self._fetch_image_if_missing(context, vi) [ 2202.806587] env[68906]: ERROR nova.compute.manager [instance: ce6e5cd6-efb8-46d1-811d-74c084661cce] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2202.806951] env[68906]: ERROR nova.compute.manager [instance: ce6e5cd6-efb8-46d1-811d-74c084661cce] image_cache(vi, tmp_image_ds_loc) [ 2202.806951] env[68906]: ERROR nova.compute.manager [instance: ce6e5cd6-efb8-46d1-811d-74c084661cce] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2202.806951] env[68906]: ERROR nova.compute.manager [instance: ce6e5cd6-efb8-46d1-811d-74c084661cce] vm_util.copy_virtual_disk( [ 2202.806951] env[68906]: ERROR nova.compute.manager [instance: ce6e5cd6-efb8-46d1-811d-74c084661cce] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2202.806951] env[68906]: ERROR nova.compute.manager [instance: ce6e5cd6-efb8-46d1-811d-74c084661cce] session._wait_for_task(vmdk_copy_task) [ 2202.806951] env[68906]: ERROR nova.compute.manager [instance: ce6e5cd6-efb8-46d1-811d-74c084661cce] File 
"/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2202.806951] env[68906]: ERROR nova.compute.manager [instance: ce6e5cd6-efb8-46d1-811d-74c084661cce] return self.wait_for_task(task_ref) [ 2202.806951] env[68906]: ERROR nova.compute.manager [instance: ce6e5cd6-efb8-46d1-811d-74c084661cce] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2202.806951] env[68906]: ERROR nova.compute.manager [instance: ce6e5cd6-efb8-46d1-811d-74c084661cce] return evt.wait() [ 2202.806951] env[68906]: ERROR nova.compute.manager [instance: ce6e5cd6-efb8-46d1-811d-74c084661cce] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2202.806951] env[68906]: ERROR nova.compute.manager [instance: ce6e5cd6-efb8-46d1-811d-74c084661cce] result = hub.switch() [ 2202.806951] env[68906]: ERROR nova.compute.manager [instance: ce6e5cd6-efb8-46d1-811d-74c084661cce] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2202.806951] env[68906]: ERROR nova.compute.manager [instance: ce6e5cd6-efb8-46d1-811d-74c084661cce] return self.greenlet.switch() [ 2202.807506] env[68906]: ERROR nova.compute.manager [instance: ce6e5cd6-efb8-46d1-811d-74c084661cce] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2202.807506] env[68906]: ERROR nova.compute.manager [instance: ce6e5cd6-efb8-46d1-811d-74c084661cce] self.f(*self.args, **self.kw) [ 2202.807506] env[68906]: ERROR nova.compute.manager [instance: ce6e5cd6-efb8-46d1-811d-74c084661cce] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2202.807506] env[68906]: ERROR nova.compute.manager [instance: ce6e5cd6-efb8-46d1-811d-74c084661cce] raise exceptions.translate_fault(task_info.error) [ 2202.807506] env[68906]: ERROR nova.compute.manager [instance: ce6e5cd6-efb8-46d1-811d-74c084661cce] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2202.807506] env[68906]: ERROR nova.compute.manager [instance: ce6e5cd6-efb8-46d1-811d-74c084661cce] Faults: ['InvalidArgument'] [ 2202.807506] env[68906]: ERROR nova.compute.manager [instance: ce6e5cd6-efb8-46d1-811d-74c084661cce] [ 2202.807506] env[68906]: INFO nova.compute.manager [None req-8706cf73-54ea-418d-91c3-3d08fba88ac5 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] [instance: ce6e5cd6-efb8-46d1-811d-74c084661cce] Terminating instance [ 2202.808761] env[68906]: DEBUG oslo_concurrency.lockutils [None req-4af5cd17-ead4-446a-9184-7b7ebdecc7be tempest-ServerTagsTestJSON-693567183 tempest-ServerTagsTestJSON-693567183-project-member] Acquired lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e/b1400c31-d33b-4e13-944f-4c645e62493e.vmdk" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2202.808969] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-4af5cd17-ead4-446a-9184-7b7ebdecc7be tempest-ServerTagsTestJSON-693567183 tempest-ServerTagsTestJSON-693567183-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=68906) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2202.809224] env[68906]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-a00004f4-8a36-494b-9062-88fd1eaf94bb {{(pid=68906) 
request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2202.811538] env[68906]: DEBUG nova.compute.manager [None req-8706cf73-54ea-418d-91c3-3d08fba88ac5 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] [instance: ce6e5cd6-efb8-46d1-811d-74c084661cce] Start destroying the instance on the hypervisor. {{(pid=68906) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2202.811729] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-8706cf73-54ea-418d-91c3-3d08fba88ac5 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] [instance: ce6e5cd6-efb8-46d1-811d-74c084661cce] Destroying instance {{(pid=68906) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2202.812429] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-13f063d3-2863-44d9-88f8-4d8708d89fd6 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2202.818691] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-8706cf73-54ea-418d-91c3-3d08fba88ac5 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] [instance: ce6e5cd6-efb8-46d1-811d-74c084661cce] Unregistering the VM {{(pid=68906) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 2202.818892] env[68906]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-d5f504b0-c244-4548-81df-e9b3241c2fc9 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2202.820871] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-4af5cd17-ead4-446a-9184-7b7ebdecc7be tempest-ServerTagsTestJSON-693567183 tempest-ServerTagsTestJSON-693567183-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=68906) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2202.821051] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-4af5cd17-ead4-446a-9184-7b7ebdecc7be tempest-ServerTagsTestJSON-693567183 tempest-ServerTagsTestJSON-693567183-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=68906) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 2202.821930] env[68906]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-3cd57e6a-d276-47ea-985b-9ceaf6c35e9b {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2202.826559] env[68906]: DEBUG oslo_vmware.api [None req-4af5cd17-ead4-446a-9184-7b7ebdecc7be tempest-ServerTagsTestJSON-693567183 tempest-ServerTagsTestJSON-693567183-project-member] Waiting for the task: (returnval){ [ 2202.826559] env[68906]: value = "session[52a3cb0d-1212-490c-2669-91043b4da4d8]5203bd31-b3aa-5966-639d-c329d4cdf207" [ 2202.826559] env[68906]: _type = "Task" [ 2202.826559] env[68906]: } to complete. {{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2202.833178] env[68906]: DEBUG oslo_vmware.api [None req-4af5cd17-ead4-446a-9184-7b7ebdecc7be tempest-ServerTagsTestJSON-693567183 tempest-ServerTagsTestJSON-693567183-project-member] Task: {'id': session[52a3cb0d-1212-490c-2669-91043b4da4d8]5203bd31-b3aa-5966-639d-c329d4cdf207, 'name': SearchDatastore_Task} progress is 0%. 
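The teardown runs in two steps: UnregisterVM drops the VM from vCenter inventory, then (in the records just below) DeleteDatastoreFile_Task removes the instance directory and is polled like any other task. A condensed sketch; the argument names are placeholders for the real managed object references, and the production code also handles per-file errors:

    def destroy_on_datastore(session, vm_ref, ds_path, dc_ref):
        session.invoke_api(session.vim, 'UnregisterVM', vm_ref)
        file_manager = session.vim.service_content.fileManager
        task = session.invoke_api(session.vim, 'DeleteDatastoreFile_Task',
                                  file_manager, name=ds_path,
                                  datacenter=dc_ref)
        session.wait_for_task(task)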
{{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2202.897431] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-8706cf73-54ea-418d-91c3-3d08fba88ac5 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] [instance: ce6e5cd6-efb8-46d1-811d-74c084661cce] Unregistered the VM {{(pid=68906) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 2202.897669] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-8706cf73-54ea-418d-91c3-3d08fba88ac5 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] [instance: ce6e5cd6-efb8-46d1-811d-74c084661cce] Deleting contents of the VM from datastore datastore2 {{(pid=68906) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 2202.897849] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-8706cf73-54ea-418d-91c3-3d08fba88ac5 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Deleting the datastore file [datastore2] ce6e5cd6-efb8-46d1-811d-74c084661cce {{(pid=68906) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 2202.898129] env[68906]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-f1e30c89-4bdb-4f54-ad15-ff0932b11c2e {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2202.903955] env[68906]: DEBUG oslo_vmware.api [None req-8706cf73-54ea-418d-91c3-3d08fba88ac5 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Waiting for the task: (returnval){ [ 2202.903955] env[68906]: value = "task-3475472" [ 2202.903955] env[68906]: _type = "Task" [ 2202.903955] env[68906]: } to complete. {{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2202.911378] env[68906]: DEBUG oslo_vmware.api [None req-8706cf73-54ea-418d-91c3-3d08fba88ac5 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Task: {'id': task-3475472, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2203.337081] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-4af5cd17-ead4-446a-9184-7b7ebdecc7be tempest-ServerTagsTestJSON-693567183 tempest-ServerTagsTestJSON-693567183-project-member] [instance: 7994d291-b4bf-48f5-ad34-c1f484d77f6e] Preparing fetch location {{(pid=68906) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 2203.337396] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-4af5cd17-ead4-446a-9184-7b7ebdecc7be tempest-ServerTagsTestJSON-693567183 tempest-ServerTagsTestJSON-693567183-project-member] Creating directory with path [datastore2] vmware_temp/c01db354-1dbd-43d3-b6da-90cc5d33bb8a/b1400c31-d33b-4e13-944f-4c645e62493e {{(pid=68906) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2203.337498] env[68906]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-23fd979d-53e6-4b99-ae0e-04e40be237d9 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2203.348866] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-4af5cd17-ead4-446a-9184-7b7ebdecc7be tempest-ServerTagsTestJSON-693567183 tempest-ServerTagsTestJSON-693567183-project-member] Created directory with path [datastore2] vmware_temp/c01db354-1dbd-43d3-b6da-90cc5d33bb8a/b1400c31-d33b-4e13-944f-4c645e62493e {{(pid=68906) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2203.349062] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-4af5cd17-ead4-446a-9184-7b7ebdecc7be tempest-ServerTagsTestJSON-693567183 tempest-ServerTagsTestJSON-693567183-project-member] [instance: 7994d291-b4bf-48f5-ad34-c1f484d77f6e] Fetch image to [datastore2] vmware_temp/c01db354-1dbd-43d3-b6da-90cc5d33bb8a/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk {{(pid=68906) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 2203.349236] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-4af5cd17-ead4-446a-9184-7b7ebdecc7be tempest-ServerTagsTestJSON-693567183 tempest-ServerTagsTestJSON-693567183-project-member] [instance: 7994d291-b4bf-48f5-ad34-c1f484d77f6e] Downloading image file data b1400c31-d33b-4e13-944f-4c645e62493e to [datastore2] vmware_temp/c01db354-1dbd-43d3-b6da-90cc5d33bb8a/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk on the data store datastore2 {{(pid=68906) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 2203.349920] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6cf9cb32-5c7b-454b-aafd-a0b5c2b47d93 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2203.356486] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e4994f1b-d715-4618-b4d1-b03bb7deee7b {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2203.365334] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-79c2f78f-e280-4823-a6d3-a4143127d1bf {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2203.395556] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-48255792-d238-4b5a-ae5c-8540df75e982 {{(pid=68906) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2203.400714] env[68906]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-76a9bf57-07aa-4f40-8485-1da10ed32052 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2203.412480] env[68906]: DEBUG oslo_vmware.api [None req-8706cf73-54ea-418d-91c3-3d08fba88ac5 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Task: {'id': task-3475472, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.075064} completed successfully. {{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2203.412718] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-8706cf73-54ea-418d-91c3-3d08fba88ac5 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Deleted the datastore file {{(pid=68906) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 2203.412901] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-8706cf73-54ea-418d-91c3-3d08fba88ac5 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] [instance: ce6e5cd6-efb8-46d1-811d-74c084661cce] Deleted contents of the VM from datastore datastore2 {{(pid=68906) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 2203.413087] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-8706cf73-54ea-418d-91c3-3d08fba88ac5 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] [instance: ce6e5cd6-efb8-46d1-811d-74c084661cce] Instance destroyed {{(pid=68906) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2203.413274] env[68906]: INFO nova.compute.manager [None req-8706cf73-54ea-418d-91c3-3d08fba88ac5 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] [instance: ce6e5cd6-efb8-46d1-811d-74c084661cce] Took 0.60 seconds to destroy the instance on the hypervisor. 
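The DeleteDatastoreFile_Task sequence above (Invoking, wait_for_task, "progress is 0%", "completed successfully") is oslo.vmware's generic task pattern: every vSphere *_Task method returns a task reference whose info.state must be polled until it reaches 'success' or 'error'. A minimal sketch of that loop, assuming a caller-supplied get_task_info helper (illustrative only, not the actual oslo.vmware source):

    import time

    def wait_for_task(get_task_info, poll_interval=0.5):
        # get_task_info is a hypothetical callable that reads the task's
        # 'info' property (a vSphere TaskInfo: state, result, error) via
        # the PropertyCollector, as the real session does.
        while True:
            info = get_task_info()
            if info.state == 'success':
                return info.result
            if info.state == 'error':
                # oslo.vmware translates the fault into an exception at this
                # point; the failed copy task further down surfaces this way
                # as VimFaultException with Faults: ['InvalidArgument'].
                raise RuntimeError(info.error)
            time.sleep(poll_interval)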
[ 2203.415328] env[68906]: DEBUG nova.compute.claims [None req-8706cf73-54ea-418d-91c3-3d08fba88ac5 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] [instance: ce6e5cd6-efb8-46d1-811d-74c084661cce] Aborting claim: {{(pid=68906) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 2203.415493] env[68906]: DEBUG oslo_concurrency.lockutils [None req-8706cf73-54ea-418d-91c3-3d08fba88ac5 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2203.415702] env[68906]: DEBUG oslo_concurrency.lockutils [None req-8706cf73-54ea-418d-91c3-3d08fba88ac5 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2203.424545] env[68906]: DEBUG nova.virt.vmwareapi.images [None req-4af5cd17-ead4-446a-9184-7b7ebdecc7be tempest-ServerTagsTestJSON-693567183 tempest-ServerTagsTestJSON-693567183-project-member] [instance: 7994d291-b4bf-48f5-ad34-c1f484d77f6e] Downloading image file data b1400c31-d33b-4e13-944f-4c645e62493e to the data store datastore2 {{(pid=68906) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 2203.479255] env[68906]: DEBUG oslo_vmware.rw_handles [None req-4af5cd17-ead4-446a-9184-7b7ebdecc7be tempest-ServerTagsTestJSON-693567183 tempest-ServerTagsTestJSON-693567183-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/c01db354-1dbd-43d3-b6da-90cc5d33bb8a/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=68906) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 2203.539561] env[68906]: DEBUG oslo_vmware.rw_handles [None req-4af5cd17-ead4-446a-9184-7b7ebdecc7be tempest-ServerTagsTestJSON-693567183 tempest-ServerTagsTestJSON-693567183-project-member] Completed reading data from the image iterator. {{(pid=68906) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 2203.539763] env[68906]: DEBUG oslo_vmware.rw_handles [None req-4af5cd17-ead4-446a-9184-7b7ebdecc7be tempest-ServerTagsTestJSON-693567183 tempest-ServerTagsTestJSON-693567183-project-member] Closing write handle for https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/c01db354-1dbd-43d3-b6da-90cc5d33bb8a/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=68906) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 2203.642190] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4ecf99e7-ab0f-43c6-90e1-f5946b08a70c {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2203.649265] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9ed705d3-2bb9-474d-ac70-8348711554d5 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2203.678317] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e15e2ed4-3ee8-4389-9e18-9b91cbe6551a {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2203.684653] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-34974d1c-40b9-4182-894c-33f5ae498007 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2203.696974] env[68906]: DEBUG nova.compute.provider_tree [None req-8706cf73-54ea-418d-91c3-3d08fba88ac5 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Inventory has not changed in ProviderTree for provider: 1119f6db-bfd7-4ef3-bdff-5c6974dc249b {{(pid=68906) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2203.705280] env[68906]: DEBUG nova.scheduler.client.report [None req-8706cf73-54ea-418d-91c3-3d08fba88ac5 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Inventory has not changed for provider 1119f6db-bfd7-4ef3-bdff-5c6974dc249b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 93, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68906) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2203.718689] env[68906]: DEBUG oslo_concurrency.lockutils [None req-8706cf73-54ea-418d-91c3-3d08fba88ac5 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.303s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2203.719235] env[68906]: ERROR nova.compute.manager [None req-8706cf73-54ea-418d-91c3-3d08fba88ac5 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] [instance: ce6e5cd6-efb8-46d1-811d-74c084661cce] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2203.719235] env[68906]: Faults: ['InvalidArgument'] [ 2203.719235] env[68906]: ERROR nova.compute.manager [instance: ce6e5cd6-efb8-46d1-811d-74c084661cce] Traceback (most recent call last): [ 2203.719235] env[68906]: ERROR nova.compute.manager [instance: ce6e5cd6-efb8-46d1-811d-74c084661cce] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 2203.719235] 
env[68906]: ERROR nova.compute.manager [instance: ce6e5cd6-efb8-46d1-811d-74c084661cce] self.driver.spawn(context, instance, image_meta, [ 2203.719235] env[68906]: ERROR nova.compute.manager [instance: ce6e5cd6-efb8-46d1-811d-74c084661cce] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2203.719235] env[68906]: ERROR nova.compute.manager [instance: ce6e5cd6-efb8-46d1-811d-74c084661cce] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2203.719235] env[68906]: ERROR nova.compute.manager [instance: ce6e5cd6-efb8-46d1-811d-74c084661cce] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2203.719235] env[68906]: ERROR nova.compute.manager [instance: ce6e5cd6-efb8-46d1-811d-74c084661cce] self._fetch_image_if_missing(context, vi) [ 2203.719235] env[68906]: ERROR nova.compute.manager [instance: ce6e5cd6-efb8-46d1-811d-74c084661cce] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2203.719235] env[68906]: ERROR nova.compute.manager [instance: ce6e5cd6-efb8-46d1-811d-74c084661cce] image_cache(vi, tmp_image_ds_loc) [ 2203.719235] env[68906]: ERROR nova.compute.manager [instance: ce6e5cd6-efb8-46d1-811d-74c084661cce] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2203.719618] env[68906]: ERROR nova.compute.manager [instance: ce6e5cd6-efb8-46d1-811d-74c084661cce] vm_util.copy_virtual_disk( [ 2203.719618] env[68906]: ERROR nova.compute.manager [instance: ce6e5cd6-efb8-46d1-811d-74c084661cce] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2203.719618] env[68906]: ERROR nova.compute.manager [instance: ce6e5cd6-efb8-46d1-811d-74c084661cce] session._wait_for_task(vmdk_copy_task) [ 2203.719618] env[68906]: ERROR nova.compute.manager [instance: ce6e5cd6-efb8-46d1-811d-74c084661cce] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2203.719618] env[68906]: ERROR nova.compute.manager [instance: ce6e5cd6-efb8-46d1-811d-74c084661cce] return self.wait_for_task(task_ref) [ 2203.719618] env[68906]: ERROR nova.compute.manager [instance: ce6e5cd6-efb8-46d1-811d-74c084661cce] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2203.719618] env[68906]: ERROR nova.compute.manager [instance: ce6e5cd6-efb8-46d1-811d-74c084661cce] return evt.wait() [ 2203.719618] env[68906]: ERROR nova.compute.manager [instance: ce6e5cd6-efb8-46d1-811d-74c084661cce] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2203.719618] env[68906]: ERROR nova.compute.manager [instance: ce6e5cd6-efb8-46d1-811d-74c084661cce] result = hub.switch() [ 2203.719618] env[68906]: ERROR nova.compute.manager [instance: ce6e5cd6-efb8-46d1-811d-74c084661cce] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2203.719618] env[68906]: ERROR nova.compute.manager [instance: ce6e5cd6-efb8-46d1-811d-74c084661cce] return self.greenlet.switch() [ 2203.719618] env[68906]: ERROR nova.compute.manager [instance: ce6e5cd6-efb8-46d1-811d-74c084661cce] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2203.719618] env[68906]: ERROR nova.compute.manager [instance: ce6e5cd6-efb8-46d1-811d-74c084661cce] self.f(*self.args, **self.kw) [ 2203.719988] env[68906]: ERROR nova.compute.manager [instance: ce6e5cd6-efb8-46d1-811d-74c084661cce] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2203.719988] env[68906]: ERROR nova.compute.manager [instance: ce6e5cd6-efb8-46d1-811d-74c084661cce] raise exceptions.translate_fault(task_info.error) [ 2203.719988] env[68906]: ERROR nova.compute.manager [instance: ce6e5cd6-efb8-46d1-811d-74c084661cce] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2203.719988] env[68906]: ERROR nova.compute.manager [instance: ce6e5cd6-efb8-46d1-811d-74c084661cce] Faults: ['InvalidArgument'] [ 2203.719988] env[68906]: ERROR nova.compute.manager [instance: ce6e5cd6-efb8-46d1-811d-74c084661cce] [ 2203.719988] env[68906]: DEBUG nova.compute.utils [None req-8706cf73-54ea-418d-91c3-3d08fba88ac5 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] [instance: ce6e5cd6-efb8-46d1-811d-74c084661cce] VimFaultException {{(pid=68906) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 2203.721293] env[68906]: DEBUG nova.compute.manager [None req-8706cf73-54ea-418d-91c3-3d08fba88ac5 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] [instance: ce6e5cd6-efb8-46d1-811d-74c084661cce] Build of instance ce6e5cd6-efb8-46d1-811d-74c084661cce was re-scheduled: A specified parameter was not correct: fileType [ 2203.721293] env[68906]: Faults: ['InvalidArgument'] {{(pid=68906) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 2203.721662] env[68906]: DEBUG nova.compute.manager [None req-8706cf73-54ea-418d-91c3-3d08fba88ac5 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] [instance: ce6e5cd6-efb8-46d1-811d-74c084661cce] Unplugging VIFs for instance {{(pid=68906) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 2203.721841] env[68906]: DEBUG nova.compute.manager [None req-8706cf73-54ea-418d-91c3-3d08fba88ac5 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=68906) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 2203.722012] env[68906]: DEBUG nova.compute.manager [None req-8706cf73-54ea-418d-91c3-3d08fba88ac5 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] [instance: ce6e5cd6-efb8-46d1-811d-74c084661cce] Deallocating network for instance {{(pid=68906) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 2203.722180] env[68906]: DEBUG nova.network.neutron [None req-8706cf73-54ea-418d-91c3-3d08fba88ac5 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] [instance: ce6e5cd6-efb8-46d1-811d-74c084661cce] deallocate_for_instance() {{(pid=68906) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2204.014641] env[68906]: DEBUG nova.network.neutron [None req-8706cf73-54ea-418d-91c3-3d08fba88ac5 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] [instance: ce6e5cd6-efb8-46d1-811d-74c084661cce] Updating instance_info_cache with network_info: [] {{(pid=68906) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2204.026129] env[68906]: INFO nova.compute.manager [None req-8706cf73-54ea-418d-91c3-3d08fba88ac5 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] [instance: ce6e5cd6-efb8-46d1-811d-74c084661cce] Took 0.30 seconds to deallocate network for instance. [ 2204.119475] env[68906]: INFO nova.scheduler.client.report [None req-8706cf73-54ea-418d-91c3-3d08fba88ac5 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Deleted allocations for instance ce6e5cd6-efb8-46d1-811d-74c084661cce [ 2204.141992] env[68906]: DEBUG oslo_concurrency.lockutils [None req-8706cf73-54ea-418d-91c3-3d08fba88ac5 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Lock "ce6e5cd6-efb8-46d1-811d-74c084661cce" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 625.976s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2204.142288] env[68906]: DEBUG oslo_concurrency.lockutils [None req-36d3bf0a-44ab-4bd7-93e3-cdff2857f986 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Lock "ce6e5cd6-efb8-46d1-811d-74c084661cce" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 430.564s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2204.142523] env[68906]: DEBUG oslo_concurrency.lockutils [None req-36d3bf0a-44ab-4bd7-93e3-cdff2857f986 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Acquiring lock "ce6e5cd6-efb8-46d1-811d-74c084661cce-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2204.142742] env[68906]: DEBUG oslo_concurrency.lockutils [None req-36d3bf0a-44ab-4bd7-93e3-cdff2857f986 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Lock "ce6e5cd6-efb8-46d1-811d-74c084661cce-events" acquired by 
"nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2204.142908] env[68906]: DEBUG oslo_concurrency.lockutils [None req-36d3bf0a-44ab-4bd7-93e3-cdff2857f986 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Lock "ce6e5cd6-efb8-46d1-811d-74c084661cce-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2204.144942] env[68906]: INFO nova.compute.manager [None req-36d3bf0a-44ab-4bd7-93e3-cdff2857f986 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] [instance: ce6e5cd6-efb8-46d1-811d-74c084661cce] Terminating instance [ 2204.146833] env[68906]: DEBUG nova.compute.manager [None req-36d3bf0a-44ab-4bd7-93e3-cdff2857f986 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] [instance: ce6e5cd6-efb8-46d1-811d-74c084661cce] Start destroying the instance on the hypervisor. {{(pid=68906) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2204.147150] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-36d3bf0a-44ab-4bd7-93e3-cdff2857f986 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] [instance: ce6e5cd6-efb8-46d1-811d-74c084661cce] Destroying instance {{(pid=68906) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2204.147685] env[68906]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-8229bb9a-4b57-47f8-b6e0-2c49ad85ecb1 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2204.161234] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-20bec315-b3af-4f20-9a30-eaa37d3d8f42 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2204.190583] env[68906]: WARNING nova.virt.vmwareapi.vmops [None req-36d3bf0a-44ab-4bd7-93e3-cdff2857f986 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] [instance: ce6e5cd6-efb8-46d1-811d-74c084661cce] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance ce6e5cd6-efb8-46d1-811d-74c084661cce could not be found. [ 2204.190798] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-36d3bf0a-44ab-4bd7-93e3-cdff2857f986 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] [instance: ce6e5cd6-efb8-46d1-811d-74c084661cce] Instance destroyed {{(pid=68906) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2204.190978] env[68906]: INFO nova.compute.manager [None req-36d3bf0a-44ab-4bd7-93e3-cdff2857f986 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] [instance: ce6e5cd6-efb8-46d1-811d-74c084661cce] Took 0.04 seconds to destroy the instance on the hypervisor. 
[ 2204.191241] env[68906]: DEBUG oslo.service.loopingcall [None req-36d3bf0a-44ab-4bd7-93e3-cdff2857f986 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=68906) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2204.191487] env[68906]: DEBUG nova.compute.manager [-] [instance: ce6e5cd6-efb8-46d1-811d-74c084661cce] Deallocating network for instance {{(pid=68906) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 2204.191585] env[68906]: DEBUG nova.network.neutron [-] [instance: ce6e5cd6-efb8-46d1-811d-74c084661cce] deallocate_for_instance() {{(pid=68906) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2204.214640] env[68906]: DEBUG nova.network.neutron [-] [instance: ce6e5cd6-efb8-46d1-811d-74c084661cce] Updating instance_info_cache with network_info: [] {{(pid=68906) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2204.223025] env[68906]: INFO nova.compute.manager [-] [instance: ce6e5cd6-efb8-46d1-811d-74c084661cce] Took 0.03 seconds to deallocate network for instance. [ 2204.323164] env[68906]: DEBUG oslo_concurrency.lockutils [None req-36d3bf0a-44ab-4bd7-93e3-cdff2857f986 tempest-AttachVolumeNegativeTest-681905198 tempest-AttachVolumeNegativeTest-681905198-project-member] Lock "ce6e5cd6-efb8-46d1-811d-74c084661cce" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.181s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2204.324808] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Lock "ce6e5cd6-efb8-46d1-811d-74c084661cce" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 381.153s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2204.325010] env[68906]: INFO nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: ce6e5cd6-efb8-46d1-811d-74c084661cce] During sync_power_state the instance has a pending task (deleting). Skip. 
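"Waiting for function ... to return" (loopingcall.py:435 above) is oslo.service's looping/retry machinery blocking the caller until the wrapped callable finishes; the CreateVM_Task wait below uses the same mechanism. A minimal sketch with the public FixedIntervalLoopingCall API; whether the retry wrapper in this log uses exactly this class is an assumption, and work_finished is hypothetical:

    from oslo_service import loopingcall

    def work_finished():
        # Hypothetical completion check; the real callers poll Neutron or
        # vCenter state instead.
        return True

    def _step():
        if work_finished():
            # Raising LoopingCallDone stops the loop; retvalue becomes the
            # result returned by .wait() below.
            raise loopingcall.LoopingCallDone(retvalue=True)

    timer = loopingcall.FixedIntervalLoopingCall(_step)
    result = timer.start(interval=1.0).wait()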
[ 2204.325200] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Lock "ce6e5cd6-efb8-46d1-811d-74c084661cce" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.001s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2208.476417] env[68906]: DEBUG oslo_concurrency.lockutils [None req-2ecf5d37-cb5f-4b29-a868-3e14699b3e3f tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Acquiring lock "0d12bd8f-0e92-4066-9ada-6eff7b4c5dbe" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2208.476417] env[68906]: DEBUG oslo_concurrency.lockutils [None req-2ecf5d37-cb5f-4b29-a868-3e14699b3e3f tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Lock "0d12bd8f-0e92-4066-9ada-6eff7b4c5dbe" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2208.487936] env[68906]: DEBUG nova.compute.manager [None req-2ecf5d37-cb5f-4b29-a868-3e14699b3e3f tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: 0d12bd8f-0e92-4066-9ada-6eff7b4c5dbe] Starting instance... {{(pid=68906) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 2208.559170] env[68906]: DEBUG oslo_concurrency.lockutils [None req-2ecf5d37-cb5f-4b29-a868-3e14699b3e3f tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2208.559498] env[68906]: DEBUG oslo_concurrency.lockutils [None req-2ecf5d37-cb5f-4b29-a868-3e14699b3e3f tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2208.561234] env[68906]: INFO nova.compute.claims [None req-2ecf5d37-cb5f-4b29-a868-3e14699b3e3f tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: 0d12bd8f-0e92-4066-9ada-6eff7b4c5dbe] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 2208.761757] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-273b548e-ca63-47c4-b597-42e3fe287beb {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2208.770656] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4fe7661e-3f4e-4617-8208-239afd687f2e {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2208.802339] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a4a6e6b1-a64f-4c12-b7db-b390470002f7 {{(pid=68906) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2208.810326] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-240f4a44-bedf-4feb-86f5-631ce8260ca0 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2208.824319] env[68906]: DEBUG nova.compute.provider_tree [None req-2ecf5d37-cb5f-4b29-a868-3e14699b3e3f tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Inventory has not changed in ProviderTree for provider: 1119f6db-bfd7-4ef3-bdff-5c6974dc249b {{(pid=68906) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2208.833675] env[68906]: DEBUG nova.scheduler.client.report [None req-2ecf5d37-cb5f-4b29-a868-3e14699b3e3f tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Inventory has not changed for provider 1119f6db-bfd7-4ef3-bdff-5c6974dc249b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 93, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68906) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2208.850723] env[68906]: DEBUG oslo_concurrency.lockutils [None req-2ecf5d37-cb5f-4b29-a868-3e14699b3e3f tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.291s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2208.851353] env[68906]: DEBUG nova.compute.manager [None req-2ecf5d37-cb5f-4b29-a868-3e14699b3e3f tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: 0d12bd8f-0e92-4066-9ada-6eff7b4c5dbe] Start building networks asynchronously for instance. {{(pid=68906) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 2208.883872] env[68906]: DEBUG nova.compute.utils [None req-2ecf5d37-cb5f-4b29-a868-3e14699b3e3f tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Using /dev/sd instead of None {{(pid=68906) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 2208.885624] env[68906]: DEBUG nova.compute.manager [None req-2ecf5d37-cb5f-4b29-a868-3e14699b3e3f tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: 0d12bd8f-0e92-4066-9ada-6eff7b4c5dbe] Allocating IP information in the background. {{(pid=68906) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 2208.885791] env[68906]: DEBUG nova.network.neutron [None req-2ecf5d37-cb5f-4b29-a868-3e14699b3e3f tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: 0d12bd8f-0e92-4066-9ada-6eff7b4c5dbe] allocate_for_instance() {{(pid=68906) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 2208.897266] env[68906]: DEBUG nova.compute.manager [None req-2ecf5d37-cb5f-4b29-a868-3e14699b3e3f tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: 0d12bd8f-0e92-4066-9ada-6eff7b4c5dbe] Start building block device mappings for instance. 
{{(pid=68906) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 2208.960528] env[68906]: DEBUG nova.policy [None req-2ecf5d37-cb5f-4b29-a868-3e14699b3e3f tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'e208107293fd4f82af1f396d43464b69', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '90f212f7916446919081fcdc0527ebb0', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68906) authorize /opt/stack/nova/nova/policy.py:203}} [ 2208.968107] env[68906]: DEBUG nova.compute.manager [None req-2ecf5d37-cb5f-4b29-a868-3e14699b3e3f tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: 0d12bd8f-0e92-4066-9ada-6eff7b4c5dbe] Start spawning the instance on the hypervisor. {{(pid=68906) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 2208.994013] env[68906]: DEBUG nova.virt.hardware [None req-2ecf5d37-cb5f-4b29-a868-3e14699b3e3f tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-17T13:00:39Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-17T13:00:23Z,direct_url=,disk_format='vmdk',id=b1400c31-d33b-4e13-944f-4c645e62493e,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='1ae7bf3a375d41c6af5e7536af51ffd1',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-17T13:00:24Z,virtual_size=,visibility=), allow threads: False {{(pid=68906) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 2208.994344] env[68906]: DEBUG nova.virt.hardware [None req-2ecf5d37-cb5f-4b29-a868-3e14699b3e3f tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Flavor limits 0:0:0 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 2208.994508] env[68906]: DEBUG nova.virt.hardware [None req-2ecf5d37-cb5f-4b29-a868-3e14699b3e3f tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Image limits 0:0:0 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 2208.994693] env[68906]: DEBUG nova.virt.hardware [None req-2ecf5d37-cb5f-4b29-a868-3e14699b3e3f tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Flavor pref 0:0:0 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 2208.994841] env[68906]: DEBUG nova.virt.hardware [None req-2ecf5d37-cb5f-4b29-a868-3e14699b3e3f tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Image pref 0:0:0 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 2208.994991] env[68906]: DEBUG nova.virt.hardware [None req-2ecf5d37-cb5f-4b29-a868-3e14699b3e3f tempest-ServersTestJSON-1226730598 
tempest-ServersTestJSON-1226730598-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 2208.995232] env[68906]: DEBUG nova.virt.hardware [None req-2ecf5d37-cb5f-4b29-a868-3e14699b3e3f tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68906) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 2208.995395] env[68906]: DEBUG nova.virt.hardware [None req-2ecf5d37-cb5f-4b29-a868-3e14699b3e3f tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68906) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 2208.995564] env[68906]: DEBUG nova.virt.hardware [None req-2ecf5d37-cb5f-4b29-a868-3e14699b3e3f tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Got 1 possible topologies {{(pid=68906) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 2208.995729] env[68906]: DEBUG nova.virt.hardware [None req-2ecf5d37-cb5f-4b29-a868-3e14699b3e3f tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68906) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 2208.995914] env[68906]: DEBUG nova.virt.hardware [None req-2ecf5d37-cb5f-4b29-a868-3e14699b3e3f tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68906) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 2208.997127] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2515eefa-fb98-4102-b65d-0570918f06bc {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2209.005415] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9f0fc96f-5d92-4765-b724-3c5470e1c8cb {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2209.264658] env[68906]: DEBUG nova.network.neutron [None req-2ecf5d37-cb5f-4b29-a868-3e14699b3e3f tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: 0d12bd8f-0e92-4066-9ada-6eff7b4c5dbe] Successfully created port: d44fb954-6035-42af-bcf1-facce30a9efc {{(pid=68906) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 2209.749235] env[68906]: DEBUG nova.compute.manager [req-798879dc-d7bf-4a5d-b071-b03f987b622c req-efca1eeb-44f6-45eb-8fcc-1ec23734d2d4 service nova] [instance: 0d12bd8f-0e92-4066-9ada-6eff7b4c5dbe] Received event network-vif-plugged-d44fb954-6035-42af-bcf1-facce30a9efc {{(pid=68906) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 2209.749475] env[68906]: DEBUG oslo_concurrency.lockutils [req-798879dc-d7bf-4a5d-b071-b03f987b622c req-efca1eeb-44f6-45eb-8fcc-1ec23734d2d4 service nova] Acquiring lock "0d12bd8f-0e92-4066-9ada-6eff7b4c5dbe-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=68906) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2209.749694] env[68906]: DEBUG oslo_concurrency.lockutils [req-798879dc-d7bf-4a5d-b071-b03f987b622c req-efca1eeb-44f6-45eb-8fcc-1ec23734d2d4 service nova] Lock "0d12bd8f-0e92-4066-9ada-6eff7b4c5dbe-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2209.749889] env[68906]: DEBUG oslo_concurrency.lockutils [req-798879dc-d7bf-4a5d-b071-b03f987b622c req-efca1eeb-44f6-45eb-8fcc-1ec23734d2d4 service nova] Lock "0d12bd8f-0e92-4066-9ada-6eff7b4c5dbe-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2209.750071] env[68906]: DEBUG nova.compute.manager [req-798879dc-d7bf-4a5d-b071-b03f987b622c req-efca1eeb-44f6-45eb-8fcc-1ec23734d2d4 service nova] [instance: 0d12bd8f-0e92-4066-9ada-6eff7b4c5dbe] No waiting events found dispatching network-vif-plugged-d44fb954-6035-42af-bcf1-facce30a9efc {{(pid=68906) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 2209.750240] env[68906]: WARNING nova.compute.manager [req-798879dc-d7bf-4a5d-b071-b03f987b622c req-efca1eeb-44f6-45eb-8fcc-1ec23734d2d4 service nova] [instance: 0d12bd8f-0e92-4066-9ada-6eff7b4c5dbe] Received unexpected event network-vif-plugged-d44fb954-6035-42af-bcf1-facce30a9efc for instance with vm_state building and task_state spawning. [ 2209.830994] env[68906]: DEBUG nova.network.neutron [None req-2ecf5d37-cb5f-4b29-a868-3e14699b3e3f tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: 0d12bd8f-0e92-4066-9ada-6eff7b4c5dbe] Successfully updated port: d44fb954-6035-42af-bcf1-facce30a9efc {{(pid=68906) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 2209.841063] env[68906]: DEBUG oslo_concurrency.lockutils [None req-2ecf5d37-cb5f-4b29-a868-3e14699b3e3f tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Acquiring lock "refresh_cache-0d12bd8f-0e92-4066-9ada-6eff7b4c5dbe" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2209.841229] env[68906]: DEBUG oslo_concurrency.lockutils [None req-2ecf5d37-cb5f-4b29-a868-3e14699b3e3f tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Acquired lock "refresh_cache-0d12bd8f-0e92-4066-9ada-6eff7b4c5dbe" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2209.841384] env[68906]: DEBUG nova.network.neutron [None req-2ecf5d37-cb5f-4b29-a868-3e14699b3e3f tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: 0d12bd8f-0e92-4066-9ada-6eff7b4c5dbe] Building network info cache for instance {{(pid=68906) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 2209.881887] env[68906]: DEBUG nova.network.neutron [None req-2ecf5d37-cb5f-4b29-a868-3e14699b3e3f tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: 0d12bd8f-0e92-4066-9ada-6eff7b4c5dbe] Instance cache missing network info. 
{{(pid=68906) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 2210.039697] env[68906]: DEBUG nova.network.neutron [None req-2ecf5d37-cb5f-4b29-a868-3e14699b3e3f tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: 0d12bd8f-0e92-4066-9ada-6eff7b4c5dbe] Updating instance_info_cache with network_info: [{"id": "d44fb954-6035-42af-bcf1-facce30a9efc", "address": "fa:16:3e:c1:e9:6d", "network": {"id": "da6ba094-8e2a-4f76-813c-8668f482685b", "bridge": "br-int", "label": "tempest-ServersTestJSON-512380607-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "90f212f7916446919081fcdc0527ebb0", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "0cd5d325-3053-407e-a4ee-f627e82a23f9", "external-id": "nsx-vlan-transportzone-809", "segmentation_id": 809, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapd44fb954-60", "ovs_interfaceid": "d44fb954-6035-42af-bcf1-facce30a9efc", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68906) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2210.053926] env[68906]: DEBUG oslo_concurrency.lockutils [None req-2ecf5d37-cb5f-4b29-a868-3e14699b3e3f tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Releasing lock "refresh_cache-0d12bd8f-0e92-4066-9ada-6eff7b4c5dbe" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2210.054217] env[68906]: DEBUG nova.compute.manager [None req-2ecf5d37-cb5f-4b29-a868-3e14699b3e3f tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: 0d12bd8f-0e92-4066-9ada-6eff7b4c5dbe] Instance network_info: |[{"id": "d44fb954-6035-42af-bcf1-facce30a9efc", "address": "fa:16:3e:c1:e9:6d", "network": {"id": "da6ba094-8e2a-4f76-813c-8668f482685b", "bridge": "br-int", "label": "tempest-ServersTestJSON-512380607-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "90f212f7916446919081fcdc0527ebb0", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "0cd5d325-3053-407e-a4ee-f627e82a23f9", "external-id": "nsx-vlan-transportzone-809", "segmentation_id": 809, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapd44fb954-60", "ovs_interfaceid": "d44fb954-6035-42af-bcf1-facce30a9efc", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=68906) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 2210.054597] env[68906]: 
DEBUG nova.virt.vmwareapi.vmops [None req-2ecf5d37-cb5f-4b29-a868-3e14699b3e3f tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: 0d12bd8f-0e92-4066-9ada-6eff7b4c5dbe] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:c1:e9:6d', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '0cd5d325-3053-407e-a4ee-f627e82a23f9', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'd44fb954-6035-42af-bcf1-facce30a9efc', 'vif_model': 'vmxnet3'}] {{(pid=68906) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 2210.062212] env[68906]: DEBUG oslo.service.loopingcall [None req-2ecf5d37-cb5f-4b29-a868-3e14699b3e3f tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=68906) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2210.062637] env[68906]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 0d12bd8f-0e92-4066-9ada-6eff7b4c5dbe] Creating VM on the ESX host {{(pid=68906) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 2210.062859] env[68906]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-286182bd-1037-44ec-8791-cbd03af09c85 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2210.082863] env[68906]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 2210.082863] env[68906]: value = "task-3475473" [ 2210.082863] env[68906]: _type = "Task" [ 2210.082863] env[68906]: } to complete. {{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2210.090229] env[68906]: DEBUG oslo_vmware.api [-] Task: {'id': task-3475473, 'name': CreateVM_Task} progress is 0%. {{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2210.519941] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2210.592819] env[68906]: DEBUG oslo_vmware.api [-] Task: {'id': task-3475473, 'name': CreateVM_Task, 'duration_secs': 0.310501} completed successfully. 
{{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2210.592966] env[68906]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 0d12bd8f-0e92-4066-9ada-6eff7b4c5dbe] Created VM on the ESX host {{(pid=68906) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 2210.593631] env[68906]: DEBUG oslo_concurrency.lockutils [None req-2ecf5d37-cb5f-4b29-a868-3e14699b3e3f tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2210.593796] env[68906]: DEBUG oslo_concurrency.lockutils [None req-2ecf5d37-cb5f-4b29-a868-3e14699b3e3f tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Acquired lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2210.594129] env[68906]: DEBUG oslo_concurrency.lockutils [None req-2ecf5d37-cb5f-4b29-a868-3e14699b3e3f tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 2210.594371] env[68906]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-d60aeb66-7647-46af-8360-3f3606828384 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2210.598463] env[68906]: DEBUG oslo_vmware.api [None req-2ecf5d37-cb5f-4b29-a868-3e14699b3e3f tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Waiting for the task: (returnval){ [ 2210.598463] env[68906]: value = "session[52a3cb0d-1212-490c-2669-91043b4da4d8]5212d910-53a9-bf0d-d4de-d9b5c9d78508" [ 2210.598463] env[68906]: _type = "Task" [ 2210.598463] env[68906]: } to complete. {{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2210.605608] env[68906]: DEBUG oslo_vmware.api [None req-2ecf5d37-cb5f-4b29-a868-3e14699b3e3f tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Task: {'id': session[52a3cb0d-1212-490c-2669-91043b4da4d8]5212d910-53a9-bf0d-d4de-d9b5c9d78508, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2211.109621] env[68906]: DEBUG oslo_concurrency.lockutils [None req-2ecf5d37-cb5f-4b29-a868-3e14699b3e3f tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Releasing lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2211.109962] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-2ecf5d37-cb5f-4b29-a868-3e14699b3e3f tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: 0d12bd8f-0e92-4066-9ada-6eff7b4c5dbe] Processing image b1400c31-d33b-4e13-944f-4c645e62493e {{(pid=68906) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 2211.110084] env[68906]: DEBUG oslo_concurrency.lockutils [None req-2ecf5d37-cb5f-4b29-a868-3e14699b3e3f tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e/b1400c31-d33b-4e13-944f-4c645e62493e.vmdk" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2211.774406] env[68906]: DEBUG nova.compute.manager [req-707fb578-3abf-4717-bba2-29807c0f7438 req-850139fb-5faa-43e4-a0dc-fc8e0a520b9e service nova] [instance: 0d12bd8f-0e92-4066-9ada-6eff7b4c5dbe] Received event network-changed-d44fb954-6035-42af-bcf1-facce30a9efc {{(pid=68906) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 2211.774626] env[68906]: DEBUG nova.compute.manager [req-707fb578-3abf-4717-bba2-29807c0f7438 req-850139fb-5faa-43e4-a0dc-fc8e0a520b9e service nova] [instance: 0d12bd8f-0e92-4066-9ada-6eff7b4c5dbe] Refreshing instance network info cache due to event network-changed-d44fb954-6035-42af-bcf1-facce30a9efc. {{(pid=68906) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 2211.774846] env[68906]: DEBUG oslo_concurrency.lockutils [req-707fb578-3abf-4717-bba2-29807c0f7438 req-850139fb-5faa-43e4-a0dc-fc8e0a520b9e service nova] Acquiring lock "refresh_cache-0d12bd8f-0e92-4066-9ada-6eff7b4c5dbe" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2211.774992] env[68906]: DEBUG oslo_concurrency.lockutils [req-707fb578-3abf-4717-bba2-29807c0f7438 req-850139fb-5faa-43e4-a0dc-fc8e0a520b9e service nova] Acquired lock "refresh_cache-0d12bd8f-0e92-4066-9ada-6eff7b4c5dbe" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2211.775169] env[68906]: DEBUG nova.network.neutron [req-707fb578-3abf-4717-bba2-29807c0f7438 req-850139fb-5faa-43e4-a0dc-fc8e0a520b9e service nova] [instance: 0d12bd8f-0e92-4066-9ada-6eff7b4c5dbe] Refreshing network info cache for port d44fb954-6035-42af-bcf1-facce30a9efc {{(pid=68906) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 2212.162813] env[68906]: DEBUG nova.network.neutron [req-707fb578-3abf-4717-bba2-29807c0f7438 req-850139fb-5faa-43e4-a0dc-fc8e0a520b9e service nova] [instance: 0d12bd8f-0e92-4066-9ada-6eff7b4c5dbe] Updated VIF entry in instance network info cache for port d44fb954-6035-42af-bcf1-facce30a9efc. 
{{(pid=68906) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 2212.163185] env[68906]: DEBUG nova.network.neutron [req-707fb578-3abf-4717-bba2-29807c0f7438 req-850139fb-5faa-43e4-a0dc-fc8e0a520b9e service nova] [instance: 0d12bd8f-0e92-4066-9ada-6eff7b4c5dbe] Updating instance_info_cache with network_info: [{"id": "d44fb954-6035-42af-bcf1-facce30a9efc", "address": "fa:16:3e:c1:e9:6d", "network": {"id": "da6ba094-8e2a-4f76-813c-8668f482685b", "bridge": "br-int", "label": "tempest-ServersTestJSON-512380607-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "90f212f7916446919081fcdc0527ebb0", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "0cd5d325-3053-407e-a4ee-f627e82a23f9", "external-id": "nsx-vlan-transportzone-809", "segmentation_id": 809, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapd44fb954-60", "ovs_interfaceid": "d44fb954-6035-42af-bcf1-facce30a9efc", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68906) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2212.172273] env[68906]: DEBUG oslo_concurrency.lockutils [req-707fb578-3abf-4717-bba2-29807c0f7438 req-850139fb-5faa-43e4-a0dc-fc8e0a520b9e service nova] Releasing lock "refresh_cache-0d12bd8f-0e92-4066-9ada-6eff7b4c5dbe" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2215.140751] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2216.142048] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2216.142048] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Starting heal instance info cache {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 2216.142048] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Rebuilding the list of instances to heal {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 2216.163241] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: 7994d291-b4bf-48f5-ad34-c1f484d77f6e] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2216.163364] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: 860248ea-e77b-4ff6-af64-b75f88a31348] Skipping network cache update for instance because it is Building. 
{{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2216.163502] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: 3cfde5a7-3148-426c-8867-ffafb33dc95b] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2216.163632] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: 01b79dfa-cd20-495d-b112-8429c28b741e] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2216.163755] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: 8bfc91d4-b1d7-449a-8d48-0e63490fe663] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2216.163877] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: d70b039d-c8ad-4ffd-84f8-08f17cb97578] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2216.163997] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: ed276c3c-6085-427d-b3b7-86bbb8660dbc] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2216.164132] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: cd208e67-55a3-4c0b-ad49-abd3a700d5ef] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2216.164251] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: a4f1c6a3-c189-4e3a-8ac9-ac6ec8b95723] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2216.164369] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: 0d12bd8f-0e92-4066-9ada-6eff7b4c5dbe] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2216.164490] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Didn't find any instances for network info cache update. 
{{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 2218.140563] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2219.140742] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2220.141014] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2224.140178] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2224.140503] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=68906) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 2228.135553] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2229.141408] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager.update_available_resource {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2229.155305] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2229.155537] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2229.155704] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2229.155858] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=68906) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 2229.157087] env[68906]: DEBUG oslo_vmware.service [-] Invoking 
PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ef7fedd1-35c5-4f5f-aede-e1e245a83d8b {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2229.165977] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-37facb06-e4a8-4852-8538-27cc851a6c30 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2229.183629] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-546d4f56-47a1-4ed3-90c3-3b2db89bba88 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2229.189548] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a98df3c3-cadb-48d5-a2ba-c463be828594 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2229.218848] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180967MB free_disk=93GB free_vcpus=48 pci_devices=None {{(pid=68906) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 2229.218983] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2229.219196] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2229.287498] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 7994d291-b4bf-48f5-ad34-c1f484d77f6e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2229.287708] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 860248ea-e77b-4ff6-af64-b75f88a31348 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2229.287848] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 3cfde5a7-3148-426c-8867-ffafb33dc95b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2229.287972] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 01b79dfa-cd20-495d-b112-8429c28b741e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2229.288111] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 8bfc91d4-b1d7-449a-8d48-0e63490fe663 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2229.288232] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance d70b039d-c8ad-4ffd-84f8-08f17cb97578 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2229.288349] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance ed276c3c-6085-427d-b3b7-86bbb8660dbc actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2229.288463] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance cd208e67-55a3-4c0b-ad49-abd3a700d5ef actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2229.288576] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance a4f1c6a3-c189-4e3a-8ac9-ac6ec8b95723 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2229.288690] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 0d12bd8f-0e92-4066-9ada-6eff7b4c5dbe actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2229.288876] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=68906) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 2229.289014] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=68906) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 2229.397985] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4b4190a5-1187-4cec-b637-08051303e836 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2229.405858] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-235043a4-260a-455b-9e7d-f694bcf03a7d {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2229.436134] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9a4d979d-9bde-4ba7-914f-ef7508438c7f {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2229.442926] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3d061eff-0a31-409e-ba74-5f0d093cc4e7 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2229.455438] env[68906]: DEBUG nova.compute.provider_tree [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Inventory has not changed in ProviderTree for provider: 1119f6db-bfd7-4ef3-bdff-5c6974dc249b {{(pid=68906) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2229.463562] env[68906]: DEBUG nova.scheduler.client.report [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Inventory has not changed for provider 1119f6db-bfd7-4ef3-bdff-5c6974dc249b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 93, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68906) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2229.475962] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=68906) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 2229.476156] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.257s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2240.305396] env[68906]: DEBUG oslo_concurrency.lockutils [None 
req-3c50595e-ba9b-475c-8f4d-71a7d92215cd tempest-ServerDiskConfigTestJSON-1909384467 tempest-ServerDiskConfigTestJSON-1909384467-project-member] Acquiring lock "cd208e67-55a3-4c0b-ad49-abd3a700d5ef" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2251.200176] env[68906]: WARNING oslo_vmware.rw_handles [None req-4af5cd17-ead4-446a-9184-7b7ebdecc7be tempest-ServerTagsTestJSON-693567183 tempest-ServerTagsTestJSON-693567183-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 2251.200176] env[68906]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 2251.200176] env[68906]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 2251.200176] env[68906]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 2251.200176] env[68906]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 2251.200176] env[68906]: ERROR oslo_vmware.rw_handles response.begin() [ 2251.200176] env[68906]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 2251.200176] env[68906]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 2251.200176] env[68906]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 2251.200176] env[68906]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 2251.200176] env[68906]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 2251.200176] env[68906]: ERROR oslo_vmware.rw_handles [ 2251.201060] env[68906]: DEBUG nova.virt.vmwareapi.images [None req-4af5cd17-ead4-446a-9184-7b7ebdecc7be tempest-ServerTagsTestJSON-693567183 tempest-ServerTagsTestJSON-693567183-project-member] [instance: 7994d291-b4bf-48f5-ad34-c1f484d77f6e] Downloaded image file data b1400c31-d33b-4e13-944f-4c645e62493e to vmware_temp/c01db354-1dbd-43d3-b6da-90cc5d33bb8a/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk on the data store datastore2 {{(pid=68906) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 2251.202715] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-4af5cd17-ead4-446a-9184-7b7ebdecc7be tempest-ServerTagsTestJSON-693567183 tempest-ServerTagsTestJSON-693567183-project-member] [instance: 7994d291-b4bf-48f5-ad34-c1f484d77f6e] Caching image {{(pid=68906) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 2251.202988] env[68906]: DEBUG nova.virt.vmwareapi.vm_util [None req-4af5cd17-ead4-446a-9184-7b7ebdecc7be tempest-ServerTagsTestJSON-693567183 tempest-ServerTagsTestJSON-693567183-project-member] Copying Virtual Disk [datastore2] vmware_temp/c01db354-1dbd-43d3-b6da-90cc5d33bb8a/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk to [datastore2] vmware_temp/c01db354-1dbd-43d3-b6da-90cc5d33bb8a/b1400c31-d33b-4e13-944f-4c645e62493e/b1400c31-d33b-4e13-944f-4c645e62493e.vmdk {{(pid=68906) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 2251.203316] env[68906]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-1d88cc14-e2c0-4bb1-a3cf-9bab2c614c17 {{(pid=68906) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2251.212340] env[68906]: DEBUG oslo_vmware.api [None req-4af5cd17-ead4-446a-9184-7b7ebdecc7be tempest-ServerTagsTestJSON-693567183 tempest-ServerTagsTestJSON-693567183-project-member] Waiting for the task: (returnval){ [ 2251.212340] env[68906]: value = "task-3475474" [ 2251.212340] env[68906]: _type = "Task" [ 2251.212340] env[68906]: } to complete. {{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2251.220278] env[68906]: DEBUG oslo_vmware.api [None req-4af5cd17-ead4-446a-9184-7b7ebdecc7be tempest-ServerTagsTestJSON-693567183 tempest-ServerTagsTestJSON-693567183-project-member] Task: {'id': task-3475474, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2251.723159] env[68906]: DEBUG oslo_vmware.exceptions [None req-4af5cd17-ead4-446a-9184-7b7ebdecc7be tempest-ServerTagsTestJSON-693567183 tempest-ServerTagsTestJSON-693567183-project-member] Fault InvalidArgument not matched. {{(pid=68906) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 2251.723442] env[68906]: DEBUG oslo_concurrency.lockutils [None req-4af5cd17-ead4-446a-9184-7b7ebdecc7be tempest-ServerTagsTestJSON-693567183 tempest-ServerTagsTestJSON-693567183-project-member] Releasing lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e/b1400c31-d33b-4e13-944f-4c645e62493e.vmdk" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2251.724039] env[68906]: ERROR nova.compute.manager [None req-4af5cd17-ead4-446a-9184-7b7ebdecc7be tempest-ServerTagsTestJSON-693567183 tempest-ServerTagsTestJSON-693567183-project-member] [instance: 7994d291-b4bf-48f5-ad34-c1f484d77f6e] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2251.724039] env[68906]: Faults: ['InvalidArgument'] [ 2251.724039] env[68906]: ERROR nova.compute.manager [instance: 7994d291-b4bf-48f5-ad34-c1f484d77f6e] Traceback (most recent call last): [ 2251.724039] env[68906]: ERROR nova.compute.manager [instance: 7994d291-b4bf-48f5-ad34-c1f484d77f6e] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 2251.724039] env[68906]: ERROR nova.compute.manager [instance: 7994d291-b4bf-48f5-ad34-c1f484d77f6e] yield resources [ 2251.724039] env[68906]: ERROR nova.compute.manager [instance: 7994d291-b4bf-48f5-ad34-c1f484d77f6e] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 2251.724039] env[68906]: ERROR nova.compute.manager [instance: 7994d291-b4bf-48f5-ad34-c1f484d77f6e] self.driver.spawn(context, instance, image_meta, [ 2251.724039] env[68906]: ERROR nova.compute.manager [instance: 7994d291-b4bf-48f5-ad34-c1f484d77f6e] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2251.724039] env[68906]: ERROR nova.compute.manager [instance: 7994d291-b4bf-48f5-ad34-c1f484d77f6e] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2251.724039] env[68906]: ERROR nova.compute.manager [instance: 7994d291-b4bf-48f5-ad34-c1f484d77f6e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2251.724039] env[68906]: ERROR nova.compute.manager [instance: 7994d291-b4bf-48f5-ad34-c1f484d77f6e] self._fetch_image_if_missing(context, vi) [ 2251.724039] env[68906]: 
ERROR nova.compute.manager [instance: 7994d291-b4bf-48f5-ad34-c1f484d77f6e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2251.724482] env[68906]: ERROR nova.compute.manager [instance: 7994d291-b4bf-48f5-ad34-c1f484d77f6e] image_cache(vi, tmp_image_ds_loc) [ 2251.724482] env[68906]: ERROR nova.compute.manager [instance: 7994d291-b4bf-48f5-ad34-c1f484d77f6e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2251.724482] env[68906]: ERROR nova.compute.manager [instance: 7994d291-b4bf-48f5-ad34-c1f484d77f6e] vm_util.copy_virtual_disk( [ 2251.724482] env[68906]: ERROR nova.compute.manager [instance: 7994d291-b4bf-48f5-ad34-c1f484d77f6e] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2251.724482] env[68906]: ERROR nova.compute.manager [instance: 7994d291-b4bf-48f5-ad34-c1f484d77f6e] session._wait_for_task(vmdk_copy_task) [ 2251.724482] env[68906]: ERROR nova.compute.manager [instance: 7994d291-b4bf-48f5-ad34-c1f484d77f6e] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2251.724482] env[68906]: ERROR nova.compute.manager [instance: 7994d291-b4bf-48f5-ad34-c1f484d77f6e] return self.wait_for_task(task_ref) [ 2251.724482] env[68906]: ERROR nova.compute.manager [instance: 7994d291-b4bf-48f5-ad34-c1f484d77f6e] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2251.724482] env[68906]: ERROR nova.compute.manager [instance: 7994d291-b4bf-48f5-ad34-c1f484d77f6e] return evt.wait() [ 2251.724482] env[68906]: ERROR nova.compute.manager [instance: 7994d291-b4bf-48f5-ad34-c1f484d77f6e] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2251.724482] env[68906]: ERROR nova.compute.manager [instance: 7994d291-b4bf-48f5-ad34-c1f484d77f6e] result = hub.switch() [ 2251.724482] env[68906]: ERROR nova.compute.manager [instance: 7994d291-b4bf-48f5-ad34-c1f484d77f6e] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2251.724482] env[68906]: ERROR nova.compute.manager [instance: 7994d291-b4bf-48f5-ad34-c1f484d77f6e] return self.greenlet.switch() [ 2251.724847] env[68906]: ERROR nova.compute.manager [instance: 7994d291-b4bf-48f5-ad34-c1f484d77f6e] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2251.724847] env[68906]: ERROR nova.compute.manager [instance: 7994d291-b4bf-48f5-ad34-c1f484d77f6e] self.f(*self.args, **self.kw) [ 2251.724847] env[68906]: ERROR nova.compute.manager [instance: 7994d291-b4bf-48f5-ad34-c1f484d77f6e] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2251.724847] env[68906]: ERROR nova.compute.manager [instance: 7994d291-b4bf-48f5-ad34-c1f484d77f6e] raise exceptions.translate_fault(task_info.error) [ 2251.724847] env[68906]: ERROR nova.compute.manager [instance: 7994d291-b4bf-48f5-ad34-c1f484d77f6e] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2251.724847] env[68906]: ERROR nova.compute.manager [instance: 7994d291-b4bf-48f5-ad34-c1f484d77f6e] Faults: ['InvalidArgument'] [ 2251.724847] env[68906]: ERROR nova.compute.manager [instance: 7994d291-b4bf-48f5-ad34-c1f484d77f6e] [ 2251.724847] env[68906]: INFO nova.compute.manager [None req-4af5cd17-ead4-446a-9184-7b7ebdecc7be tempest-ServerTagsTestJSON-693567183 
tempest-ServerTagsTestJSON-693567183-project-member] [instance: 7994d291-b4bf-48f5-ad34-c1f484d77f6e] Terminating instance [ 2251.725967] env[68906]: DEBUG oslo_concurrency.lockutils [None req-08d15b06-bc1d-4202-8ca5-a913682ec4aa tempest-ServerDiskConfigTestJSON-1909384467 tempest-ServerDiskConfigTestJSON-1909384467-project-member] Acquired lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e/b1400c31-d33b-4e13-944f-4c645e62493e.vmdk" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2251.726187] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-08d15b06-bc1d-4202-8ca5-a913682ec4aa tempest-ServerDiskConfigTestJSON-1909384467 tempest-ServerDiskConfigTestJSON-1909384467-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=68906) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2251.726451] env[68906]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-0f401ce1-51bf-42d4-8193-a4744f85e4d4 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2251.730013] env[68906]: DEBUG nova.compute.manager [None req-4af5cd17-ead4-446a-9184-7b7ebdecc7be tempest-ServerTagsTestJSON-693567183 tempest-ServerTagsTestJSON-693567183-project-member] [instance: 7994d291-b4bf-48f5-ad34-c1f484d77f6e] Start destroying the instance on the hypervisor. {{(pid=68906) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2251.730237] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-4af5cd17-ead4-446a-9184-7b7ebdecc7be tempest-ServerTagsTestJSON-693567183 tempest-ServerTagsTestJSON-693567183-project-member] [instance: 7994d291-b4bf-48f5-ad34-c1f484d77f6e] Destroying instance {{(pid=68906) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2251.730977] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ec623fd9-756b-44c1-b7ae-2b5f2d8b8a49 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2251.738431] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-4af5cd17-ead4-446a-9184-7b7ebdecc7be tempest-ServerTagsTestJSON-693567183 tempest-ServerTagsTestJSON-693567183-project-member] [instance: 7994d291-b4bf-48f5-ad34-c1f484d77f6e] Unregistering the VM {{(pid=68906) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 2251.739432] env[68906]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-0a84a694-675d-4242-abb2-0f7dda9e36a5 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2251.740821] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-08d15b06-bc1d-4202-8ca5-a913682ec4aa tempest-ServerDiskConfigTestJSON-1909384467 tempest-ServerDiskConfigTestJSON-1909384467-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=68906) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2251.740999] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-08d15b06-bc1d-4202-8ca5-a913682ec4aa tempest-ServerDiskConfigTestJSON-1909384467 tempest-ServerDiskConfigTestJSON-1909384467-project-member] Folder [datastore2] devstack-image-cache_base created. 
{{(pid=68906) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 2251.741675] env[68906]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-654b1d1d-b581-4462-acdf-e0a5d6835f5e {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2251.746799] env[68906]: DEBUG oslo_vmware.api [None req-08d15b06-bc1d-4202-8ca5-a913682ec4aa tempest-ServerDiskConfigTestJSON-1909384467 tempest-ServerDiskConfigTestJSON-1909384467-project-member] Waiting for the task: (returnval){ [ 2251.746799] env[68906]: value = "session[52a3cb0d-1212-490c-2669-91043b4da4d8]52ea81ef-5717-4b20-309e-75808b1499a9" [ 2251.746799] env[68906]: _type = "Task" [ 2251.746799] env[68906]: } to complete. {{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2251.757452] env[68906]: DEBUG oslo_vmware.api [None req-08d15b06-bc1d-4202-8ca5-a913682ec4aa tempest-ServerDiskConfigTestJSON-1909384467 tempest-ServerDiskConfigTestJSON-1909384467-project-member] Task: {'id': session[52a3cb0d-1212-490c-2669-91043b4da4d8]52ea81ef-5717-4b20-309e-75808b1499a9, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2251.809173] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-4af5cd17-ead4-446a-9184-7b7ebdecc7be tempest-ServerTagsTestJSON-693567183 tempest-ServerTagsTestJSON-693567183-project-member] [instance: 7994d291-b4bf-48f5-ad34-c1f484d77f6e] Unregistered the VM {{(pid=68906) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 2251.809384] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-4af5cd17-ead4-446a-9184-7b7ebdecc7be tempest-ServerTagsTestJSON-693567183 tempest-ServerTagsTestJSON-693567183-project-member] [instance: 7994d291-b4bf-48f5-ad34-c1f484d77f6e] Deleting contents of the VM from datastore datastore2 {{(pid=68906) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 2251.809557] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-4af5cd17-ead4-446a-9184-7b7ebdecc7be tempest-ServerTagsTestJSON-693567183 tempest-ServerTagsTestJSON-693567183-project-member] Deleting the datastore file [datastore2] 7994d291-b4bf-48f5-ad34-c1f484d77f6e {{(pid=68906) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 2251.809819] env[68906]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-7e1c494d-2050-4988-9cdc-e996b88e42e9 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2251.815538] env[68906]: DEBUG oslo_vmware.api [None req-4af5cd17-ead4-446a-9184-7b7ebdecc7be tempest-ServerTagsTestJSON-693567183 tempest-ServerTagsTestJSON-693567183-project-member] Waiting for the task: (returnval){ [ 2251.815538] env[68906]: value = "task-3475476" [ 2251.815538] env[68906]: _type = "Task" [ 2251.815538] env[68906]: } to complete. {{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2251.823060] env[68906]: DEBUG oslo_vmware.api [None req-4af5cd17-ead4-446a-9184-7b7ebdecc7be tempest-ServerTagsTestJSON-693567183 tempest-ServerTagsTestJSON-693567183-project-member] Task: {'id': task-3475476, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2252.257852] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-08d15b06-bc1d-4202-8ca5-a913682ec4aa tempest-ServerDiskConfigTestJSON-1909384467 tempest-ServerDiskConfigTestJSON-1909384467-project-member] [instance: 860248ea-e77b-4ff6-af64-b75f88a31348] Preparing fetch location {{(pid=68906) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 2252.258128] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-08d15b06-bc1d-4202-8ca5-a913682ec4aa tempest-ServerDiskConfigTestJSON-1909384467 tempest-ServerDiskConfigTestJSON-1909384467-project-member] Creating directory with path [datastore2] vmware_temp/a13e6be8-e535-42fe-bf5b-ad2fe3cd517a/b1400c31-d33b-4e13-944f-4c645e62493e {{(pid=68906) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2252.258352] env[68906]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-ceea647e-2821-4d05-9a06-463c4e072394 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2252.268857] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-08d15b06-bc1d-4202-8ca5-a913682ec4aa tempest-ServerDiskConfigTestJSON-1909384467 tempest-ServerDiskConfigTestJSON-1909384467-project-member] Created directory with path [datastore2] vmware_temp/a13e6be8-e535-42fe-bf5b-ad2fe3cd517a/b1400c31-d33b-4e13-944f-4c645e62493e {{(pid=68906) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2252.269062] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-08d15b06-bc1d-4202-8ca5-a913682ec4aa tempest-ServerDiskConfigTestJSON-1909384467 tempest-ServerDiskConfigTestJSON-1909384467-project-member] [instance: 860248ea-e77b-4ff6-af64-b75f88a31348] Fetch image to [datastore2] vmware_temp/a13e6be8-e535-42fe-bf5b-ad2fe3cd517a/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk {{(pid=68906) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 2252.269234] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-08d15b06-bc1d-4202-8ca5-a913682ec4aa tempest-ServerDiskConfigTestJSON-1909384467 tempest-ServerDiskConfigTestJSON-1909384467-project-member] [instance: 860248ea-e77b-4ff6-af64-b75f88a31348] Downloading image file data b1400c31-d33b-4e13-944f-4c645e62493e to [datastore2] vmware_temp/a13e6be8-e535-42fe-bf5b-ad2fe3cd517a/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk on the data store datastore2 {{(pid=68906) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 2252.269933] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dbbdfb0b-01f5-4cf9-9629-41997b0aaa7d {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2252.276328] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0c50976e-4fba-4104-820d-3bdf66ac3c90 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2252.285123] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8642e63b-e94c-45be-8f25-6dae29522d27 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2252.315137] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-188a1e78-bcfa-424e-ab82-d64aa6175459 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2252.325492] env[68906]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-900040e3-7283-4d5a-bee8-fdaf4d2ebe6e {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2252.327138] env[68906]: DEBUG oslo_vmware.api [None req-4af5cd17-ead4-446a-9184-7b7ebdecc7be tempest-ServerTagsTestJSON-693567183 tempest-ServerTagsTestJSON-693567183-project-member] Task: {'id': task-3475476, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.060843} completed successfully. {{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2252.327368] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-4af5cd17-ead4-446a-9184-7b7ebdecc7be tempest-ServerTagsTestJSON-693567183 tempest-ServerTagsTestJSON-693567183-project-member] Deleted the datastore file {{(pid=68906) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 2252.327547] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-4af5cd17-ead4-446a-9184-7b7ebdecc7be tempest-ServerTagsTestJSON-693567183 tempest-ServerTagsTestJSON-693567183-project-member] [instance: 7994d291-b4bf-48f5-ad34-c1f484d77f6e] Deleted contents of the VM from datastore datastore2 {{(pid=68906) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 2252.327714] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-4af5cd17-ead4-446a-9184-7b7ebdecc7be tempest-ServerTagsTestJSON-693567183 tempest-ServerTagsTestJSON-693567183-project-member] [instance: 7994d291-b4bf-48f5-ad34-c1f484d77f6e] Instance destroyed {{(pid=68906) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2252.327883] env[68906]: INFO nova.compute.manager [None req-4af5cd17-ead4-446a-9184-7b7ebdecc7be tempest-ServerTagsTestJSON-693567183 tempest-ServerTagsTestJSON-693567183-project-member] [instance: 7994d291-b4bf-48f5-ad34-c1f484d77f6e] Took 0.60 seconds to destroy the instance on the hypervisor. 
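[editor's note] The CopyVirtualDisk_Task and DeleteDatastoreFile_Task entries above follow oslo.vmware's submit-then-poll pattern: the SOAP call returns a task reference, and the client polls progress until the task reports success or the fault is translated into an exception (which is how the InvalidArgument traceback earlier surfaces from wait_for_task). A stdlib-only sketch of that loop; fetch_task_info, the task states, and the task id are hypothetical stand-ins, not the vSphere or oslo.vmware API:

    import itertools
    import time

    def fetch_task_info(task_id, _progress=itertools.count(0, 50)):
        # Pretend each poll advances the task by 50%.
        pct = min(next(_progress), 100)
        state = "success" if pct >= 100 else "running"
        return {"id": task_id, "state": state, "progress": pct}

    def wait_for_task(task_id, interval=0.1):
        while True:
            info = fetch_task_info(task_id)
            print("Task: %(id)s progress is %(progress)d%%." % info)
            if info["state"] == "success":
                return info
            if info["state"] == "error":
                raise RuntimeError("task %s failed" % task_id)
            time.sleep(interval)

    wait_for_task("task-example")

In the real driver this polling runs inside a looping call on the API session, which is why a task fault shows up in the tracebacks above as an exception raised out of _poll_task and re-raised from wait_for_task.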
[ 2252.329920] env[68906]: DEBUG nova.compute.claims [None req-4af5cd17-ead4-446a-9184-7b7ebdecc7be tempest-ServerTagsTestJSON-693567183 tempest-ServerTagsTestJSON-693567183-project-member] [instance: 7994d291-b4bf-48f5-ad34-c1f484d77f6e] Aborting claim: {{(pid=68906) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 2252.330102] env[68906]: DEBUG oslo_concurrency.lockutils [None req-4af5cd17-ead4-446a-9184-7b7ebdecc7be tempest-ServerTagsTestJSON-693567183 tempest-ServerTagsTestJSON-693567183-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2252.330315] env[68906]: DEBUG oslo_concurrency.lockutils [None req-4af5cd17-ead4-446a-9184-7b7ebdecc7be tempest-ServerTagsTestJSON-693567183 tempest-ServerTagsTestJSON-693567183-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2252.351167] env[68906]: DEBUG nova.virt.vmwareapi.images [None req-08d15b06-bc1d-4202-8ca5-a913682ec4aa tempest-ServerDiskConfigTestJSON-1909384467 tempest-ServerDiskConfigTestJSON-1909384467-project-member] [instance: 860248ea-e77b-4ff6-af64-b75f88a31348] Downloading image file data b1400c31-d33b-4e13-944f-4c645e62493e to the data store datastore2 {{(pid=68906) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 2252.401085] env[68906]: DEBUG oslo_vmware.rw_handles [None req-08d15b06-bc1d-4202-8ca5-a913682ec4aa tempest-ServerDiskConfigTestJSON-1909384467 tempest-ServerDiskConfigTestJSON-1909384467-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/a13e6be8-e535-42fe-bf5b-ad2fe3cd517a/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=68906) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 2252.462276] env[68906]: DEBUG oslo_vmware.rw_handles [None req-08d15b06-bc1d-4202-8ca5-a913682ec4aa tempest-ServerDiskConfigTestJSON-1909384467 tempest-ServerDiskConfigTestJSON-1909384467-project-member] Completed reading data from the image iterator. {{(pid=68906) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 2252.462507] env[68906]: DEBUG oslo_vmware.rw_handles [None req-08d15b06-bc1d-4202-8ca5-a913682ec4aa tempest-ServerDiskConfigTestJSON-1909384467 tempest-ServerDiskConfigTestJSON-1909384467-project-member] Closing write handle for https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/a13e6be8-e535-42fe-bf5b-ad2fe3cd517a/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=68906) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 2252.549635] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fed47440-538b-4271-a050-1b6d9684d370 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2252.557304] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-98aac790-d17f-46fc-932b-cb0b0732ee26 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2252.586704] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1fc1860f-6773-474f-b6a3-6778238e3949 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2252.593252] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-184bc3d2-7f83-4632-8dbf-812eb17990f0 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2252.605859] env[68906]: DEBUG nova.compute.provider_tree [None req-4af5cd17-ead4-446a-9184-7b7ebdecc7be tempest-ServerTagsTestJSON-693567183 tempest-ServerTagsTestJSON-693567183-project-member] Inventory has not changed in ProviderTree for provider: 1119f6db-bfd7-4ef3-bdff-5c6974dc249b {{(pid=68906) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2252.615478] env[68906]: DEBUG nova.scheduler.client.report [None req-4af5cd17-ead4-446a-9184-7b7ebdecc7be tempest-ServerTagsTestJSON-693567183 tempest-ServerTagsTestJSON-693567183-project-member] Inventory has not changed for provider 1119f6db-bfd7-4ef3-bdff-5c6974dc249b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 93, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68906) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2252.629864] env[68906]: DEBUG oslo_concurrency.lockutils [None req-4af5cd17-ead4-446a-9184-7b7ebdecc7be tempest-ServerTagsTestJSON-693567183 tempest-ServerTagsTestJSON-693567183-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.299s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2252.630546] env[68906]: ERROR nova.compute.manager [None req-4af5cd17-ead4-446a-9184-7b7ebdecc7be tempest-ServerTagsTestJSON-693567183 tempest-ServerTagsTestJSON-693567183-project-member] [instance: 7994d291-b4bf-48f5-ad34-c1f484d77f6e] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2252.630546] env[68906]: Faults: ['InvalidArgument'] [ 2252.630546] env[68906]: ERROR nova.compute.manager [instance: 7994d291-b4bf-48f5-ad34-c1f484d77f6e] Traceback (most recent call last): [ 2252.630546] env[68906]: ERROR nova.compute.manager [instance: 7994d291-b4bf-48f5-ad34-c1f484d77f6e] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 2252.630546] env[68906]: ERROR nova.compute.manager [instance: 
7994d291-b4bf-48f5-ad34-c1f484d77f6e] self.driver.spawn(context, instance, image_meta, [ 2252.630546] env[68906]: ERROR nova.compute.manager [instance: 7994d291-b4bf-48f5-ad34-c1f484d77f6e] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2252.630546] env[68906]: ERROR nova.compute.manager [instance: 7994d291-b4bf-48f5-ad34-c1f484d77f6e] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2252.630546] env[68906]: ERROR nova.compute.manager [instance: 7994d291-b4bf-48f5-ad34-c1f484d77f6e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2252.630546] env[68906]: ERROR nova.compute.manager [instance: 7994d291-b4bf-48f5-ad34-c1f484d77f6e] self._fetch_image_if_missing(context, vi) [ 2252.630546] env[68906]: ERROR nova.compute.manager [instance: 7994d291-b4bf-48f5-ad34-c1f484d77f6e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2252.630546] env[68906]: ERROR nova.compute.manager [instance: 7994d291-b4bf-48f5-ad34-c1f484d77f6e] image_cache(vi, tmp_image_ds_loc) [ 2252.630546] env[68906]: ERROR nova.compute.manager [instance: 7994d291-b4bf-48f5-ad34-c1f484d77f6e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2252.630869] env[68906]: ERROR nova.compute.manager [instance: 7994d291-b4bf-48f5-ad34-c1f484d77f6e] vm_util.copy_virtual_disk( [ 2252.630869] env[68906]: ERROR nova.compute.manager [instance: 7994d291-b4bf-48f5-ad34-c1f484d77f6e] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2252.630869] env[68906]: ERROR nova.compute.manager [instance: 7994d291-b4bf-48f5-ad34-c1f484d77f6e] session._wait_for_task(vmdk_copy_task) [ 2252.630869] env[68906]: ERROR nova.compute.manager [instance: 7994d291-b4bf-48f5-ad34-c1f484d77f6e] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2252.630869] env[68906]: ERROR nova.compute.manager [instance: 7994d291-b4bf-48f5-ad34-c1f484d77f6e] return self.wait_for_task(task_ref) [ 2252.630869] env[68906]: ERROR nova.compute.manager [instance: 7994d291-b4bf-48f5-ad34-c1f484d77f6e] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2252.630869] env[68906]: ERROR nova.compute.manager [instance: 7994d291-b4bf-48f5-ad34-c1f484d77f6e] return evt.wait() [ 2252.630869] env[68906]: ERROR nova.compute.manager [instance: 7994d291-b4bf-48f5-ad34-c1f484d77f6e] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2252.630869] env[68906]: ERROR nova.compute.manager [instance: 7994d291-b4bf-48f5-ad34-c1f484d77f6e] result = hub.switch() [ 2252.630869] env[68906]: ERROR nova.compute.manager [instance: 7994d291-b4bf-48f5-ad34-c1f484d77f6e] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2252.630869] env[68906]: ERROR nova.compute.manager [instance: 7994d291-b4bf-48f5-ad34-c1f484d77f6e] return self.greenlet.switch() [ 2252.630869] env[68906]: ERROR nova.compute.manager [instance: 7994d291-b4bf-48f5-ad34-c1f484d77f6e] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2252.630869] env[68906]: ERROR nova.compute.manager [instance: 7994d291-b4bf-48f5-ad34-c1f484d77f6e] self.f(*self.args, **self.kw) [ 2252.631180] env[68906]: ERROR nova.compute.manager [instance: 7994d291-b4bf-48f5-ad34-c1f484d77f6e] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2252.631180] env[68906]: ERROR nova.compute.manager [instance: 7994d291-b4bf-48f5-ad34-c1f484d77f6e] raise exceptions.translate_fault(task_info.error) [ 2252.631180] env[68906]: ERROR nova.compute.manager [instance: 7994d291-b4bf-48f5-ad34-c1f484d77f6e] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2252.631180] env[68906]: ERROR nova.compute.manager [instance: 7994d291-b4bf-48f5-ad34-c1f484d77f6e] Faults: ['InvalidArgument'] [ 2252.631180] env[68906]: ERROR nova.compute.manager [instance: 7994d291-b4bf-48f5-ad34-c1f484d77f6e] [ 2252.631293] env[68906]: DEBUG nova.compute.utils [None req-4af5cd17-ead4-446a-9184-7b7ebdecc7be tempest-ServerTagsTestJSON-693567183 tempest-ServerTagsTestJSON-693567183-project-member] [instance: 7994d291-b4bf-48f5-ad34-c1f484d77f6e] VimFaultException {{(pid=68906) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 2252.632616] env[68906]: DEBUG nova.compute.manager [None req-4af5cd17-ead4-446a-9184-7b7ebdecc7be tempest-ServerTagsTestJSON-693567183 tempest-ServerTagsTestJSON-693567183-project-member] [instance: 7994d291-b4bf-48f5-ad34-c1f484d77f6e] Build of instance 7994d291-b4bf-48f5-ad34-c1f484d77f6e was re-scheduled: A specified parameter was not correct: fileType [ 2252.632616] env[68906]: Faults: ['InvalidArgument'] {{(pid=68906) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 2252.633035] env[68906]: DEBUG nova.compute.manager [None req-4af5cd17-ead4-446a-9184-7b7ebdecc7be tempest-ServerTagsTestJSON-693567183 tempest-ServerTagsTestJSON-693567183-project-member] [instance: 7994d291-b4bf-48f5-ad34-c1f484d77f6e] Unplugging VIFs for instance {{(pid=68906) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 2252.633213] env[68906]: DEBUG nova.compute.manager [None req-4af5cd17-ead4-446a-9184-7b7ebdecc7be tempest-ServerTagsTestJSON-693567183 tempest-ServerTagsTestJSON-693567183-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=68906) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 2252.633419] env[68906]: DEBUG nova.compute.manager [None req-4af5cd17-ead4-446a-9184-7b7ebdecc7be tempest-ServerTagsTestJSON-693567183 tempest-ServerTagsTestJSON-693567183-project-member] [instance: 7994d291-b4bf-48f5-ad34-c1f484d77f6e] Deallocating network for instance {{(pid=68906) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 2252.633542] env[68906]: DEBUG nova.network.neutron [None req-4af5cd17-ead4-446a-9184-7b7ebdecc7be tempest-ServerTagsTestJSON-693567183 tempest-ServerTagsTestJSON-693567183-project-member] [instance: 7994d291-b4bf-48f5-ad34-c1f484d77f6e] deallocate_for_instance() {{(pid=68906) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2253.013410] env[68906]: DEBUG nova.network.neutron [None req-4af5cd17-ead4-446a-9184-7b7ebdecc7be tempest-ServerTagsTestJSON-693567183 tempest-ServerTagsTestJSON-693567183-project-member] [instance: 7994d291-b4bf-48f5-ad34-c1f484d77f6e] Updating instance_info_cache with network_info: [] {{(pid=68906) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2253.026550] env[68906]: INFO nova.compute.manager [None req-4af5cd17-ead4-446a-9184-7b7ebdecc7be tempest-ServerTagsTestJSON-693567183 tempest-ServerTagsTestJSON-693567183-project-member] [instance: 7994d291-b4bf-48f5-ad34-c1f484d77f6e] Took 0.39 seconds to deallocate network for instance. [ 2253.123288] env[68906]: INFO nova.scheduler.client.report [None req-4af5cd17-ead4-446a-9184-7b7ebdecc7be tempest-ServerTagsTestJSON-693567183 tempest-ServerTagsTestJSON-693567183-project-member] Deleted allocations for instance 7994d291-b4bf-48f5-ad34-c1f484d77f6e [ 2253.149622] env[68906]: DEBUG oslo_concurrency.lockutils [None req-4af5cd17-ead4-446a-9184-7b7ebdecc7be tempest-ServerTagsTestJSON-693567183 tempest-ServerTagsTestJSON-693567183-project-member] Lock "7994d291-b4bf-48f5-ad34-c1f484d77f6e" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 621.679s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2253.149622] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Lock "7994d291-b4bf-48f5-ad34-c1f484d77f6e" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 429.978s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2253.149841] env[68906]: INFO nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: 7994d291-b4bf-48f5-ad34-c1f484d77f6e] During sync_power_state the instance has a pending task (spawning). Skip. 
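[editor's note] The "held 621.679s" / "waited 429.978s" pair above is the per-instance UUID lock doing its job: the build, power-state sync, and terminate paths all synchronize on the same lock name, so the terminate request queued behind the long-running build. A minimal sketch of that pattern, assuming oslo.concurrency is installed; the UUID constant and the worker function are illustrative placeholders, not Nova's manager code:

    from oslo_concurrency import lockutils

    INSTANCE_UUID = "7994d291-b4bf-48f5-ad34-c1f484d77f6e"

    def do_terminate_instance():
        # Any other code path taking the same lock name (build, power sync)
        # is serialized against this block.
        with lockutils.lock(INSTANCE_UUID):
            print("terminating while holding lock %s" % INSTANCE_UUID)

    do_terminate_instance()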
[ 2253.149880] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Lock "7994d291-b4bf-48f5-ad34-c1f484d77f6e" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2253.150214] env[68906]: DEBUG oslo_concurrency.lockutils [None req-9f5c00ab-85af-4a9f-80ab-50bd6f212585 tempest-ServerTagsTestJSON-693567183 tempest-ServerTagsTestJSON-693567183-project-member] Lock "7994d291-b4bf-48f5-ad34-c1f484d77f6e" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 425.572s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2253.150356] env[68906]: DEBUG oslo_concurrency.lockutils [None req-9f5c00ab-85af-4a9f-80ab-50bd6f212585 tempest-ServerTagsTestJSON-693567183 tempest-ServerTagsTestJSON-693567183-project-member] Acquiring lock "7994d291-b4bf-48f5-ad34-c1f484d77f6e-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2253.150542] env[68906]: DEBUG oslo_concurrency.lockutils [None req-9f5c00ab-85af-4a9f-80ab-50bd6f212585 tempest-ServerTagsTestJSON-693567183 tempest-ServerTagsTestJSON-693567183-project-member] Lock "7994d291-b4bf-48f5-ad34-c1f484d77f6e-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2253.150701] env[68906]: DEBUG oslo_concurrency.lockutils [None req-9f5c00ab-85af-4a9f-80ab-50bd6f212585 tempest-ServerTagsTestJSON-693567183 tempest-ServerTagsTestJSON-693567183-project-member] Lock "7994d291-b4bf-48f5-ad34-c1f484d77f6e-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2253.154891] env[68906]: INFO nova.compute.manager [None req-9f5c00ab-85af-4a9f-80ab-50bd6f212585 tempest-ServerTagsTestJSON-693567183 tempest-ServerTagsTestJSON-693567183-project-member] [instance: 7994d291-b4bf-48f5-ad34-c1f484d77f6e] Terminating instance [ 2253.157034] env[68906]: DEBUG nova.compute.manager [None req-9f5c00ab-85af-4a9f-80ab-50bd6f212585 tempest-ServerTagsTestJSON-693567183 tempest-ServerTagsTestJSON-693567183-project-member] [instance: 7994d291-b4bf-48f5-ad34-c1f484d77f6e] Start destroying the instance on the hypervisor. 
{{(pid=68906) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2253.157128] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-9f5c00ab-85af-4a9f-80ab-50bd6f212585 tempest-ServerTagsTestJSON-693567183 tempest-ServerTagsTestJSON-693567183-project-member] [instance: 7994d291-b4bf-48f5-ad34-c1f484d77f6e] Destroying instance {{(pid=68906) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2253.157375] env[68906]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-cab7b602-6e83-4aa9-9d3d-907f7dca74d7 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2253.166724] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-14648870-5b9c-4377-b377-5bf8d42bc277 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2253.195655] env[68906]: WARNING nova.virt.vmwareapi.vmops [None req-9f5c00ab-85af-4a9f-80ab-50bd6f212585 tempest-ServerTagsTestJSON-693567183 tempest-ServerTagsTestJSON-693567183-project-member] [instance: 7994d291-b4bf-48f5-ad34-c1f484d77f6e] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 7994d291-b4bf-48f5-ad34-c1f484d77f6e could not be found. [ 2253.195846] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-9f5c00ab-85af-4a9f-80ab-50bd6f212585 tempest-ServerTagsTestJSON-693567183 tempest-ServerTagsTestJSON-693567183-project-member] [instance: 7994d291-b4bf-48f5-ad34-c1f484d77f6e] Instance destroyed {{(pid=68906) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2253.196033] env[68906]: INFO nova.compute.manager [None req-9f5c00ab-85af-4a9f-80ab-50bd6f212585 tempest-ServerTagsTestJSON-693567183 tempest-ServerTagsTestJSON-693567183-project-member] [instance: 7994d291-b4bf-48f5-ad34-c1f484d77f6e] Took 0.04 seconds to destroy the instance on the hypervisor. [ 2253.196284] env[68906]: DEBUG oslo.service.loopingcall [None req-9f5c00ab-85af-4a9f-80ab-50bd6f212585 tempest-ServerTagsTestJSON-693567183 tempest-ServerTagsTestJSON-693567183-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=68906) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2253.196534] env[68906]: DEBUG nova.compute.manager [-] [instance: 7994d291-b4bf-48f5-ad34-c1f484d77f6e] Deallocating network for instance {{(pid=68906) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 2253.196638] env[68906]: DEBUG nova.network.neutron [-] [instance: 7994d291-b4bf-48f5-ad34-c1f484d77f6e] deallocate_for_instance() {{(pid=68906) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2253.218766] env[68906]: DEBUG nova.network.neutron [-] [instance: 7994d291-b4bf-48f5-ad34-c1f484d77f6e] Updating instance_info_cache with network_info: [] {{(pid=68906) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2253.226616] env[68906]: INFO nova.compute.manager [-] [instance: 7994d291-b4bf-48f5-ad34-c1f484d77f6e] Took 0.03 seconds to deallocate network for instance. 
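[editor's note] The "Waiting for function ... _deallocate_network_with_retries to return" entry above comes from oslo.service's looping-call machinery: the work function is invoked repeatedly until it signals completion, and the caller blocks on the returned event. A small sketch of that control flow, assuming oslo.service is installed; the retry counter and return value are made up for illustration (Nova's actual helper uses a back-off variant, but the shape is the same):

    from oslo_service import loopingcall

    attempts = {"count": 0}

    def _deallocate():
        # Pretend the first two attempts hit a retryable error.
        attempts["count"] += 1
        if attempts["count"] < 3:
            return  # loop again after the interval
        raise loopingcall.LoopingCallDone(retvalue="deallocated")

    timer = loopingcall.FixedIntervalLoopingCall(_deallocate)
    print(timer.start(interval=0.01).wait())  # blocks until LoopingCallDone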
[ 2253.309642] env[68906]: DEBUG oslo_concurrency.lockutils [None req-9f5c00ab-85af-4a9f-80ab-50bd6f212585 tempest-ServerTagsTestJSON-693567183 tempest-ServerTagsTestJSON-693567183-project-member] Lock "7994d291-b4bf-48f5-ad34-c1f484d77f6e" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.159s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 2268.311359] env[68906]: DEBUG oslo_concurrency.lockutils [None req-de864506-05d9-47c3-b5ab-d61369efac66 tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] Acquiring lock "a4f1c6a3-c189-4e3a-8ac9-ac6ec8b95723" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 2272.476013] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 2275.142598] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 2276.140869] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 2276.141078] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Starting heal instance info cache {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}}
[ 2276.141209] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Rebuilding the list of instances to heal {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}}
[ 2276.161901] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: 860248ea-e77b-4ff6-af64-b75f88a31348] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 2276.162152] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: 3cfde5a7-3148-426c-8867-ffafb33dc95b] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 2276.163039] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: 01b79dfa-cd20-495d-b112-8429c28b741e] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 2276.163039] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: 8bfc91d4-b1d7-449a-8d48-0e63490fe663] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 2276.163039] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: d70b039d-c8ad-4ffd-84f8-08f17cb97578] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 2276.163039] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: ed276c3c-6085-427d-b3b7-86bbb8660dbc] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 2276.163039] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: cd208e67-55a3-4c0b-ad49-abd3a700d5ef] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 2276.163458] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: a4f1c6a3-c189-4e3a-8ac9-ac6ec8b95723] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 2276.163458] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: 0d12bd8f-0e92-4066-9ada-6eff7b4c5dbe] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 2276.163458] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Didn't find any instances for network info cache update. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}}
[ 2279.140779] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 2281.141076] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 2281.141496] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 2284.140117] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 2284.140385] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=68906) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}}
[ 2287.137061] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 2289.155739] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 2291.141591] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager.update_available_resource {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 2291.154247] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 2291.154517] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 2291.154705] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 2291.154862] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=68906) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}}
[ 2291.156210] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e8154608-c0ae-4eed-96de-40be617fc0aa {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2291.165334] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6d8a5703-7a58-407d-94fa-61edbfa706aa {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2291.179886] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f285cb74-53d8-4801-a8ed-f9d388b10ead {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2291.185921] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9e63eb1a-ebf8-4032-9cb2-547f52c010bb {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2291.214572] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180970MB free_disk=93GB free_vcpus=48 pci_devices=None {{(pid=68906) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}}
[ 2291.214731] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 2291.214893] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 2291.282970] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 860248ea-e77b-4ff6-af64-b75f88a31348 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 2291.283158] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 3cfde5a7-3148-426c-8867-ffafb33dc95b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 2291.283293] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 01b79dfa-cd20-495d-b112-8429c28b741e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 2291.283418] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 8bfc91d4-b1d7-449a-8d48-0e63490fe663 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 2291.283614] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance d70b039d-c8ad-4ffd-84f8-08f17cb97578 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 2291.283757] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance ed276c3c-6085-427d-b3b7-86bbb8660dbc actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 2291.283881] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance cd208e67-55a3-4c0b-ad49-abd3a700d5ef actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 2291.283999] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance a4f1c6a3-c189-4e3a-8ac9-ac6ec8b95723 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 2291.284134] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 0d12bd8f-0e92-4066-9ada-6eff7b4c5dbe actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 2291.284334] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Total usable vcpus: 48, total allocated vcpus: 9 {{(pid=68906) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}}
[ 2291.284485] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1664MB phys_disk=200GB used_disk=9GB total_vcpus=48 used_vcpus=9 pci_stats=[] {{(pid=68906) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}}
[ 2291.394987] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-28edd4cc-1688-4973-8a83-db0ed9683fe6 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2291.404012] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-17a86342-4aa1-4964-8ff7-cb1e7b6219b4 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2291.434811] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-126de4c8-87c1-470b-910c-271b4e4dd340 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2291.441848] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fc8d4db7-4ed0-4006-9a14-505722eb74ee {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2291.454574] env[68906]: DEBUG nova.compute.provider_tree [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Inventory has not changed in ProviderTree for provider: 1119f6db-bfd7-4ef3-bdff-5c6974dc249b {{(pid=68906) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 2291.463065] env[68906]: DEBUG nova.scheduler.client.report [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Inventory has not changed for provider 1119f6db-bfd7-4ef3-bdff-5c6974dc249b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 93, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68906) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 2291.481137] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=68906) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}}
[ 2291.481137] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.265s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 2300.634284] env[68906]: WARNING oslo_vmware.rw_handles [None req-08d15b06-bc1d-4202-8ca5-a913682ec4aa tempest-ServerDiskConfigTestJSON-1909384467 tempest-ServerDiskConfigTestJSON-1909384467-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response
[ 2300.634284] env[68906]: ERROR oslo_vmware.rw_handles Traceback (most recent call last):
[ 2300.634284] env[68906]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close
[ 2300.634284] env[68906]: ERROR oslo_vmware.rw_handles self._conn.getresponse()
[ 2300.634284] env[68906]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse
[ 2300.634284] env[68906]: ERROR oslo_vmware.rw_handles response.begin()
[ 2300.634284] env[68906]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin
[ 2300.634284] env[68906]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status()
[ 2300.634284] env[68906]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status
[ 2300.634284] env[68906]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without"
[ 2300.634284] env[68906]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response
[ 2300.634284] env[68906]: ERROR oslo_vmware.rw_handles
[ 2300.634983] env[68906]: DEBUG nova.virt.vmwareapi.images [None req-08d15b06-bc1d-4202-8ca5-a913682ec4aa tempest-ServerDiskConfigTestJSON-1909384467 tempest-ServerDiskConfigTestJSON-1909384467-project-member] [instance: 860248ea-e77b-4ff6-af64-b75f88a31348] Downloaded image file data b1400c31-d33b-4e13-944f-4c645e62493e to vmware_temp/a13e6be8-e535-42fe-bf5b-ad2fe3cd517a/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk on the data store datastore2 {{(pid=68906) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}}
[ 2300.636904] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-08d15b06-bc1d-4202-8ca5-a913682ec4aa tempest-ServerDiskConfigTestJSON-1909384467 tempest-ServerDiskConfigTestJSON-1909384467-project-member] [instance: 860248ea-e77b-4ff6-af64-b75f88a31348] Caching image {{(pid=68906) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}}
[ 2300.637155] env[68906]: DEBUG nova.virt.vmwareapi.vm_util [None req-08d15b06-bc1d-4202-8ca5-a913682ec4aa tempest-ServerDiskConfigTestJSON-1909384467 tempest-ServerDiskConfigTestJSON-1909384467-project-member] Copying Virtual Disk [datastore2] vmware_temp/a13e6be8-e535-42fe-bf5b-ad2fe3cd517a/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk to [datastore2] vmware_temp/a13e6be8-e535-42fe-bf5b-ad2fe3cd517a/b1400c31-d33b-4e13-944f-4c645e62493e/b1400c31-d33b-4e13-944f-4c645e62493e.vmdk {{(pid=68906) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}}
[ 2300.637463] env[68906]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-9d08d41c-2d13-4133-9760-9a72fc379aae {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2300.646736] env[68906]: DEBUG oslo_vmware.api [None req-08d15b06-bc1d-4202-8ca5-a913682ec4aa tempest-ServerDiskConfigTestJSON-1909384467 tempest-ServerDiskConfigTestJSON-1909384467-project-member] Waiting for the task: (returnval){
[ 2300.646736] env[68906]: value = "task-3475477"
[ 2300.646736] env[68906]: _type = "Task"
[ 2300.646736] env[68906]: } to complete. {{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 2300.655339] env[68906]: DEBUG oslo_vmware.api [None req-08d15b06-bc1d-4202-8ca5-a913682ec4aa tempest-ServerDiskConfigTestJSON-1909384467 tempest-ServerDiskConfigTestJSON-1909384467-project-member] Task: {'id': task-3475477, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 2301.157432] env[68906]: DEBUG oslo_vmware.exceptions [None req-08d15b06-bc1d-4202-8ca5-a913682ec4aa tempest-ServerDiskConfigTestJSON-1909384467 tempest-ServerDiskConfigTestJSON-1909384467-project-member] Fault InvalidArgument not matched. {{(pid=68906) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}}
[ 2301.157678] env[68906]: DEBUG oslo_concurrency.lockutils [None req-08d15b06-bc1d-4202-8ca5-a913682ec4aa tempest-ServerDiskConfigTestJSON-1909384467 tempest-ServerDiskConfigTestJSON-1909384467-project-member] Releasing lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e/b1400c31-d33b-4e13-944f-4c645e62493e.vmdk" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 2301.158330] env[68906]: ERROR nova.compute.manager [None req-08d15b06-bc1d-4202-8ca5-a913682ec4aa tempest-ServerDiskConfigTestJSON-1909384467 tempest-ServerDiskConfigTestJSON-1909384467-project-member] [instance: 860248ea-e77b-4ff6-af64-b75f88a31348] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 2301.158330] env[68906]: Faults: ['InvalidArgument']
[ 2301.158330] env[68906]: ERROR nova.compute.manager [instance: 860248ea-e77b-4ff6-af64-b75f88a31348] Traceback (most recent call last):
[ 2301.158330] env[68906]: ERROR nova.compute.manager [instance: 860248ea-e77b-4ff6-af64-b75f88a31348] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources
[ 2301.158330] env[68906]: ERROR nova.compute.manager [instance: 860248ea-e77b-4ff6-af64-b75f88a31348] yield resources
[ 2301.158330] env[68906]: ERROR nova.compute.manager [instance: 860248ea-e77b-4ff6-af64-b75f88a31348] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance
[ 2301.158330] env[68906]: ERROR nova.compute.manager [instance: 860248ea-e77b-4ff6-af64-b75f88a31348] self.driver.spawn(context, instance, image_meta,
[ 2301.158330] env[68906]: ERROR nova.compute.manager [instance: 860248ea-e77b-4ff6-af64-b75f88a31348] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn
[ 2301.158330] env[68906]: ERROR nova.compute.manager [instance: 860248ea-e77b-4ff6-af64-b75f88a31348] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 2301.158330] env[68906]: ERROR nova.compute.manager [instance: 860248ea-e77b-4ff6-af64-b75f88a31348] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 2301.158330] env[68906]: ERROR nova.compute.manager [instance: 860248ea-e77b-4ff6-af64-b75f88a31348] self._fetch_image_if_missing(context, vi)
[ 2301.158330] env[68906]: ERROR nova.compute.manager [instance: 860248ea-e77b-4ff6-af64-b75f88a31348] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 2301.158759] env[68906]: ERROR nova.compute.manager [instance: 860248ea-e77b-4ff6-af64-b75f88a31348] image_cache(vi, tmp_image_ds_loc)
[ 2301.158759] env[68906]: ERROR nova.compute.manager [instance: 860248ea-e77b-4ff6-af64-b75f88a31348] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 2301.158759] env[68906]: ERROR nova.compute.manager [instance: 860248ea-e77b-4ff6-af64-b75f88a31348] vm_util.copy_virtual_disk(
[ 2301.158759] env[68906]: ERROR nova.compute.manager [instance: 860248ea-e77b-4ff6-af64-b75f88a31348] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 2301.158759] env[68906]: ERROR nova.compute.manager [instance: 860248ea-e77b-4ff6-af64-b75f88a31348] session._wait_for_task(vmdk_copy_task)
[ 2301.158759] env[68906]: ERROR nova.compute.manager [instance: 860248ea-e77b-4ff6-af64-b75f88a31348] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 2301.158759] env[68906]: ERROR nova.compute.manager [instance: 860248ea-e77b-4ff6-af64-b75f88a31348] return self.wait_for_task(task_ref)
[ 2301.158759] env[68906]: ERROR nova.compute.manager [instance: 860248ea-e77b-4ff6-af64-b75f88a31348] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 2301.158759] env[68906]: ERROR nova.compute.manager [instance: 860248ea-e77b-4ff6-af64-b75f88a31348] return evt.wait()
[ 2301.158759] env[68906]: ERROR nova.compute.manager [instance: 860248ea-e77b-4ff6-af64-b75f88a31348] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait
[ 2301.158759] env[68906]: ERROR nova.compute.manager [instance: 860248ea-e77b-4ff6-af64-b75f88a31348] result = hub.switch()
[ 2301.158759] env[68906]: ERROR nova.compute.manager [instance: 860248ea-e77b-4ff6-af64-b75f88a31348] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch
[ 2301.158759] env[68906]: ERROR nova.compute.manager [instance: 860248ea-e77b-4ff6-af64-b75f88a31348] return self.greenlet.switch()
[ 2301.159109] env[68906]: ERROR nova.compute.manager [instance: 860248ea-e77b-4ff6-af64-b75f88a31348] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 2301.159109] env[68906]: ERROR nova.compute.manager [instance: 860248ea-e77b-4ff6-af64-b75f88a31348] self.f(*self.args, **self.kw)
[ 2301.159109] env[68906]: ERROR nova.compute.manager [instance: 860248ea-e77b-4ff6-af64-b75f88a31348] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 2301.159109] env[68906]: ERROR nova.compute.manager [instance: 860248ea-e77b-4ff6-af64-b75f88a31348] raise exceptions.translate_fault(task_info.error)
[ 2301.159109] env[68906]: ERROR nova.compute.manager [instance: 860248ea-e77b-4ff6-af64-b75f88a31348] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 2301.159109] env[68906]: ERROR nova.compute.manager [instance: 860248ea-e77b-4ff6-af64-b75f88a31348] Faults: ['InvalidArgument']
[ 2301.159109] env[68906]: ERROR nova.compute.manager [instance: 860248ea-e77b-4ff6-af64-b75f88a31348]
[ 2301.159109] env[68906]: INFO nova.compute.manager [None req-08d15b06-bc1d-4202-8ca5-a913682ec4aa tempest-ServerDiskConfigTestJSON-1909384467 tempest-ServerDiskConfigTestJSON-1909384467-project-member] [instance: 860248ea-e77b-4ff6-af64-b75f88a31348] Terminating instance
[ 2301.160319] env[68906]: DEBUG oslo_concurrency.lockutils [None req-d367298b-38d5-456e-960c-a902d561e42c tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] Acquired lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e/b1400c31-d33b-4e13-944f-4c645e62493e.vmdk" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 2301.160536] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-d367298b-38d5-456e-960c-a902d561e42c tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=68906) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 2301.160790] env[68906]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-82c97afb-fa80-426f-8d76-310bca6eb949 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2301.163402] env[68906]: DEBUG nova.compute.manager [None req-08d15b06-bc1d-4202-8ca5-a913682ec4aa tempest-ServerDiskConfigTestJSON-1909384467 tempest-ServerDiskConfigTestJSON-1909384467-project-member] [instance: 860248ea-e77b-4ff6-af64-b75f88a31348] Start destroying the instance on the hypervisor. {{(pid=68906) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}}
[ 2301.163604] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-08d15b06-bc1d-4202-8ca5-a913682ec4aa tempest-ServerDiskConfigTestJSON-1909384467 tempest-ServerDiskConfigTestJSON-1909384467-project-member] [instance: 860248ea-e77b-4ff6-af64-b75f88a31348] Destroying instance {{(pid=68906) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 2301.164416] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f6fda78b-bd8f-4c54-bda0-049943585b2b {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2301.171689] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-08d15b06-bc1d-4202-8ca5-a913682ec4aa tempest-ServerDiskConfigTestJSON-1909384467 tempest-ServerDiskConfigTestJSON-1909384467-project-member] [instance: 860248ea-e77b-4ff6-af64-b75f88a31348] Unregistering the VM {{(pid=68906) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}}
[ 2301.171940] env[68906]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-130cd442-b768-4a42-a773-62ffb8af401a {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2301.174280] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-d367298b-38d5-456e-960c-a902d561e42c tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=68906) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 2301.174451] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-d367298b-38d5-456e-960c-a902d561e42c tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=68906) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}}
[ 2301.175440] env[68906]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-552690ea-0d34-429e-ab38-ef0a8db34eaa {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2301.181247] env[68906]: DEBUG oslo_vmware.api [None req-d367298b-38d5-456e-960c-a902d561e42c tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] Waiting for the task: (returnval){
[ 2301.181247] env[68906]: value = "session[52a3cb0d-1212-490c-2669-91043b4da4d8]524dfa72-4fab-ee64-ebd3-620bf2f96aa9"
[ 2301.181247] env[68906]: _type = "Task"
[ 2301.181247] env[68906]: } to complete. {{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 2301.191683] env[68906]: DEBUG oslo_vmware.api [None req-d367298b-38d5-456e-960c-a902d561e42c tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] Task: {'id': session[52a3cb0d-1212-490c-2669-91043b4da4d8]524dfa72-4fab-ee64-ebd3-620bf2f96aa9, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 2301.249617] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-08d15b06-bc1d-4202-8ca5-a913682ec4aa tempest-ServerDiskConfigTestJSON-1909384467 tempest-ServerDiskConfigTestJSON-1909384467-project-member] [instance: 860248ea-e77b-4ff6-af64-b75f88a31348] Unregistered the VM {{(pid=68906) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}}
[ 2301.249878] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-08d15b06-bc1d-4202-8ca5-a913682ec4aa tempest-ServerDiskConfigTestJSON-1909384467 tempest-ServerDiskConfigTestJSON-1909384467-project-member] [instance: 860248ea-e77b-4ff6-af64-b75f88a31348] Deleting contents of the VM from datastore datastore2 {{(pid=68906) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}}
[ 2301.250075] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-08d15b06-bc1d-4202-8ca5-a913682ec4aa tempest-ServerDiskConfigTestJSON-1909384467 tempest-ServerDiskConfigTestJSON-1909384467-project-member] Deleting the datastore file [datastore2] 860248ea-e77b-4ff6-af64-b75f88a31348 {{(pid=68906) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}}
[ 2301.250292] env[68906]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-55b6e718-bbfc-4ce5-9eb6-b0ccd23f49c7 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2301.257933] env[68906]: DEBUG oslo_vmware.api [None req-08d15b06-bc1d-4202-8ca5-a913682ec4aa tempest-ServerDiskConfigTestJSON-1909384467 tempest-ServerDiskConfigTestJSON-1909384467-project-member] Waiting for the task: (returnval){
[ 2301.257933] env[68906]: value = "task-3475479"
[ 2301.257933] env[68906]: _type = "Task"
[ 2301.257933] env[68906]: } to complete. {{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 2301.266055] env[68906]: DEBUG oslo_vmware.api [None req-08d15b06-bc1d-4202-8ca5-a913682ec4aa tempest-ServerDiskConfigTestJSON-1909384467 tempest-ServerDiskConfigTestJSON-1909384467-project-member] Task: {'id': task-3475479, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 2301.691530] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-d367298b-38d5-456e-960c-a902d561e42c tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] [instance: 3cfde5a7-3148-426c-8867-ffafb33dc95b] Preparing fetch location {{(pid=68906) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}}
[ 2301.691870] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-d367298b-38d5-456e-960c-a902d561e42c tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] Creating directory with path [datastore2] vmware_temp/476f7592-8587-469f-836c-a74865978e9c/b1400c31-d33b-4e13-944f-4c645e62493e {{(pid=68906) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 2301.691967] env[68906]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-54bf9148-7079-45b7-a1a4-484ae1824e6f {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2301.702470] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-d367298b-38d5-456e-960c-a902d561e42c tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] Created directory with path [datastore2] vmware_temp/476f7592-8587-469f-836c-a74865978e9c/b1400c31-d33b-4e13-944f-4c645e62493e {{(pid=68906) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 2301.702662] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-d367298b-38d5-456e-960c-a902d561e42c tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] [instance: 3cfde5a7-3148-426c-8867-ffafb33dc95b] Fetch image to [datastore2] vmware_temp/476f7592-8587-469f-836c-a74865978e9c/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk {{(pid=68906) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}}
[ 2301.702834] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-d367298b-38d5-456e-960c-a902d561e42c tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] [instance: 3cfde5a7-3148-426c-8867-ffafb33dc95b] Downloading image file data b1400c31-d33b-4e13-944f-4c645e62493e to [datastore2] vmware_temp/476f7592-8587-469f-836c-a74865978e9c/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk on the data store datastore2 {{(pid=68906) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}}
[ 2301.703546] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0a6b3137-7f66-47c6-998f-ae5f03324d2c {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2301.709960] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5763f671-142a-4033-925e-d7216e3a1d38 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2301.718846] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-13f2fc41-07f1-46b2-96ad-932c46c2caa8 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2301.749593] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d2b9d674-9efe-4c2d-8248-5f5afbdaf439 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2301.754654] env[68906]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-2ed380d6-8d4f-4c30-9038-14c9e25cb980 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2301.765902] env[68906]: DEBUG oslo_vmware.api [None req-08d15b06-bc1d-4202-8ca5-a913682ec4aa tempest-ServerDiskConfigTestJSON-1909384467 tempest-ServerDiskConfigTestJSON-1909384467-project-member] Task: {'id': task-3475479, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.100598} completed successfully. {{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}}
[ 2301.766129] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-08d15b06-bc1d-4202-8ca5-a913682ec4aa tempest-ServerDiskConfigTestJSON-1909384467 tempest-ServerDiskConfigTestJSON-1909384467-project-member] Deleted the datastore file {{(pid=68906) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}}
[ 2301.766310] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-08d15b06-bc1d-4202-8ca5-a913682ec4aa tempest-ServerDiskConfigTestJSON-1909384467 tempest-ServerDiskConfigTestJSON-1909384467-project-member] [instance: 860248ea-e77b-4ff6-af64-b75f88a31348] Deleted contents of the VM from datastore datastore2 {{(pid=68906) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}}
[ 2301.766518] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-08d15b06-bc1d-4202-8ca5-a913682ec4aa tempest-ServerDiskConfigTestJSON-1909384467 tempest-ServerDiskConfigTestJSON-1909384467-project-member] [instance: 860248ea-e77b-4ff6-af64-b75f88a31348] Instance destroyed {{(pid=68906) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 2301.766760] env[68906]: INFO nova.compute.manager [None req-08d15b06-bc1d-4202-8ca5-a913682ec4aa tempest-ServerDiskConfigTestJSON-1909384467 tempest-ServerDiskConfigTestJSON-1909384467-project-member] [instance: 860248ea-e77b-4ff6-af64-b75f88a31348] Took 0.60 seconds to destroy the instance on the hypervisor.
[ 2301.768719] env[68906]: DEBUG nova.compute.claims [None req-08d15b06-bc1d-4202-8ca5-a913682ec4aa tempest-ServerDiskConfigTestJSON-1909384467 tempest-ServerDiskConfigTestJSON-1909384467-project-member] [instance: 860248ea-e77b-4ff6-af64-b75f88a31348] Aborting claim: {{(pid=68906) abort /opt/stack/nova/nova/compute/claims.py:85}}
[ 2301.768885] env[68906]: DEBUG oslo_concurrency.lockutils [None req-08d15b06-bc1d-4202-8ca5-a913682ec4aa tempest-ServerDiskConfigTestJSON-1909384467 tempest-ServerDiskConfigTestJSON-1909384467-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 2301.769119] env[68906]: DEBUG oslo_concurrency.lockutils [None req-08d15b06-bc1d-4202-8ca5-a913682ec4aa tempest-ServerDiskConfigTestJSON-1909384467 tempest-ServerDiskConfigTestJSON-1909384467-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 2301.774228] env[68906]: DEBUG nova.virt.vmwareapi.images [None req-d367298b-38d5-456e-960c-a902d561e42c tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] [instance: 3cfde5a7-3148-426c-8867-ffafb33dc95b] Downloading image file data b1400c31-d33b-4e13-944f-4c645e62493e to the data store datastore2 {{(pid=68906) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}}
[ 2301.828579] env[68906]: DEBUG oslo_vmware.rw_handles [None req-d367298b-38d5-456e-960c-a902d561e42c tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/476f7592-8587-469f-836c-a74865978e9c/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=68906) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}}
[ 2301.887258] env[68906]: DEBUG oslo_vmware.rw_handles [None req-d367298b-38d5-456e-960c-a902d561e42c tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] Completed reading data from the image iterator. {{(pid=68906) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}}
[ 2301.887442] env[68906]: DEBUG oslo_vmware.rw_handles [None req-d367298b-38d5-456e-960c-a902d561e42c tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] Closing write handle for https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/476f7592-8587-469f-836c-a74865978e9c/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=68906) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}}
[ 2301.969738] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0fda60e8-3f93-45ba-8ee2-530d20e33ffa {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2301.976798] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5ca89a94-9e87-4630-8e7b-037896c8c7c5 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2302.005933] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-726e181e-7321-4b5e-a339-586de67f4d34 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2302.012668] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fbf902b5-e0b4-4940-8bd1-432b82cffbef {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2302.026298] env[68906]: DEBUG nova.compute.provider_tree [None req-08d15b06-bc1d-4202-8ca5-a913682ec4aa tempest-ServerDiskConfigTestJSON-1909384467 tempest-ServerDiskConfigTestJSON-1909384467-project-member] Inventory has not changed in ProviderTree for provider: 1119f6db-bfd7-4ef3-bdff-5c6974dc249b {{(pid=68906) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 2302.035697] env[68906]: DEBUG nova.scheduler.client.report [None req-08d15b06-bc1d-4202-8ca5-a913682ec4aa tempest-ServerDiskConfigTestJSON-1909384467 tempest-ServerDiskConfigTestJSON-1909384467-project-member] Inventory has not changed for provider 1119f6db-bfd7-4ef3-bdff-5c6974dc249b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 93, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68906) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 2302.049604] env[68906]: DEBUG oslo_concurrency.lockutils [None req-08d15b06-bc1d-4202-8ca5-a913682ec4aa tempest-ServerDiskConfigTestJSON-1909384467 tempest-ServerDiskConfigTestJSON-1909384467-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.280s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 2302.050135] env[68906]: ERROR nova.compute.manager [None req-08d15b06-bc1d-4202-8ca5-a913682ec4aa tempest-ServerDiskConfigTestJSON-1909384467 tempest-ServerDiskConfigTestJSON-1909384467-project-member] [instance: 860248ea-e77b-4ff6-af64-b75f88a31348] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 2302.050135] env[68906]: Faults: ['InvalidArgument']
[ 2302.050135] env[68906]: ERROR nova.compute.manager [instance: 860248ea-e77b-4ff6-af64-b75f88a31348] Traceback (most recent call last):
[ 2302.050135] env[68906]: ERROR nova.compute.manager [instance: 860248ea-e77b-4ff6-af64-b75f88a31348] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance
[ 2302.050135] env[68906]: ERROR nova.compute.manager [instance: 860248ea-e77b-4ff6-af64-b75f88a31348] self.driver.spawn(context, instance, image_meta,
[ 2302.050135] env[68906]: ERROR nova.compute.manager [instance: 860248ea-e77b-4ff6-af64-b75f88a31348] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn
[ 2302.050135] env[68906]: ERROR nova.compute.manager [instance: 860248ea-e77b-4ff6-af64-b75f88a31348] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 2302.050135] env[68906]: ERROR nova.compute.manager [instance: 860248ea-e77b-4ff6-af64-b75f88a31348] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 2302.050135] env[68906]: ERROR nova.compute.manager [instance: 860248ea-e77b-4ff6-af64-b75f88a31348] self._fetch_image_if_missing(context, vi)
[ 2302.050135] env[68906]: ERROR nova.compute.manager [instance: 860248ea-e77b-4ff6-af64-b75f88a31348] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 2302.050135] env[68906]: ERROR nova.compute.manager [instance: 860248ea-e77b-4ff6-af64-b75f88a31348] image_cache(vi, tmp_image_ds_loc)
[ 2302.050135] env[68906]: ERROR nova.compute.manager [instance: 860248ea-e77b-4ff6-af64-b75f88a31348] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 2302.050434] env[68906]: ERROR nova.compute.manager [instance: 860248ea-e77b-4ff6-af64-b75f88a31348] vm_util.copy_virtual_disk(
[ 2302.050434] env[68906]: ERROR nova.compute.manager [instance: 860248ea-e77b-4ff6-af64-b75f88a31348] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 2302.050434] env[68906]: ERROR nova.compute.manager [instance: 860248ea-e77b-4ff6-af64-b75f88a31348] session._wait_for_task(vmdk_copy_task)
[ 2302.050434] env[68906]: ERROR nova.compute.manager [instance: 860248ea-e77b-4ff6-af64-b75f88a31348] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 2302.050434] env[68906]: ERROR nova.compute.manager [instance: 860248ea-e77b-4ff6-af64-b75f88a31348] return self.wait_for_task(task_ref)
[ 2302.050434] env[68906]: ERROR nova.compute.manager [instance: 860248ea-e77b-4ff6-af64-b75f88a31348] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 2302.050434] env[68906]: ERROR nova.compute.manager [instance: 860248ea-e77b-4ff6-af64-b75f88a31348] return evt.wait()
[ 2302.050434] env[68906]: ERROR nova.compute.manager [instance: 860248ea-e77b-4ff6-af64-b75f88a31348] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait
[ 2302.050434] env[68906]: ERROR nova.compute.manager [instance: 860248ea-e77b-4ff6-af64-b75f88a31348] result = hub.switch()
[ 2302.050434] env[68906]: ERROR nova.compute.manager [instance: 860248ea-e77b-4ff6-af64-b75f88a31348] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch
[ 2302.050434] env[68906]: ERROR nova.compute.manager [instance: 860248ea-e77b-4ff6-af64-b75f88a31348] return self.greenlet.switch()
[ 2302.050434] env[68906]: ERROR nova.compute.manager [instance: 860248ea-e77b-4ff6-af64-b75f88a31348] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 2302.050434] env[68906]: ERROR nova.compute.manager [instance: 860248ea-e77b-4ff6-af64-b75f88a31348] self.f(*self.args, **self.kw)
[ 2302.050726] env[68906]: ERROR nova.compute.manager [instance: 860248ea-e77b-4ff6-af64-b75f88a31348] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 2302.050726] env[68906]: ERROR nova.compute.manager [instance: 860248ea-e77b-4ff6-af64-b75f88a31348] raise exceptions.translate_fault(task_info.error)
[ 2302.050726] env[68906]: ERROR nova.compute.manager [instance: 860248ea-e77b-4ff6-af64-b75f88a31348] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 2302.050726] env[68906]: ERROR nova.compute.manager [instance: 860248ea-e77b-4ff6-af64-b75f88a31348] Faults: ['InvalidArgument']
[ 2302.050726] env[68906]: ERROR nova.compute.manager [instance: 860248ea-e77b-4ff6-af64-b75f88a31348]
[ 2302.050841] env[68906]: DEBUG nova.compute.utils [None req-08d15b06-bc1d-4202-8ca5-a913682ec4aa tempest-ServerDiskConfigTestJSON-1909384467 tempest-ServerDiskConfigTestJSON-1909384467-project-member] [instance: 860248ea-e77b-4ff6-af64-b75f88a31348] VimFaultException {{(pid=68906) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 2302.052145] env[68906]: DEBUG nova.compute.manager [None req-08d15b06-bc1d-4202-8ca5-a913682ec4aa tempest-ServerDiskConfigTestJSON-1909384467 tempest-ServerDiskConfigTestJSON-1909384467-project-member] [instance: 860248ea-e77b-4ff6-af64-b75f88a31348] Build of instance 860248ea-e77b-4ff6-af64-b75f88a31348 was re-scheduled: A specified parameter was not correct: fileType
[ 2302.052145] env[68906]: Faults: ['InvalidArgument'] {{(pid=68906) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}}
[ 2302.052507] env[68906]: DEBUG nova.compute.manager [None req-08d15b06-bc1d-4202-8ca5-a913682ec4aa tempest-ServerDiskConfigTestJSON-1909384467 tempest-ServerDiskConfigTestJSON-1909384467-project-member] [instance: 860248ea-e77b-4ff6-af64-b75f88a31348] Unplugging VIFs for instance {{(pid=68906) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}}
[ 2302.052678] env[68906]: DEBUG nova.compute.manager [None req-08d15b06-bc1d-4202-8ca5-a913682ec4aa tempest-ServerDiskConfigTestJSON-1909384467 tempest-ServerDiskConfigTestJSON-1909384467-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=68906) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}}
[ 2302.052856] env[68906]: DEBUG nova.compute.manager [None req-08d15b06-bc1d-4202-8ca5-a913682ec4aa tempest-ServerDiskConfigTestJSON-1909384467 tempest-ServerDiskConfigTestJSON-1909384467-project-member] [instance: 860248ea-e77b-4ff6-af64-b75f88a31348] Deallocating network for instance {{(pid=68906) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}}
[ 2302.053030] env[68906]: DEBUG nova.network.neutron [None req-08d15b06-bc1d-4202-8ca5-a913682ec4aa tempest-ServerDiskConfigTestJSON-1909384467 tempest-ServerDiskConfigTestJSON-1909384467-project-member] [instance: 860248ea-e77b-4ff6-af64-b75f88a31348] deallocate_for_instance() {{(pid=68906) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 2302.350288] env[68906]: DEBUG nova.network.neutron [None req-08d15b06-bc1d-4202-8ca5-a913682ec4aa tempest-ServerDiskConfigTestJSON-1909384467 tempest-ServerDiskConfigTestJSON-1909384467-project-member] [instance: 860248ea-e77b-4ff6-af64-b75f88a31348] Updating instance_info_cache with network_info: [] {{(pid=68906) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 2302.361981] env[68906]: INFO nova.compute.manager [None req-08d15b06-bc1d-4202-8ca5-a913682ec4aa tempest-ServerDiskConfigTestJSON-1909384467 tempest-ServerDiskConfigTestJSON-1909384467-project-member] [instance: 860248ea-e77b-4ff6-af64-b75f88a31348] Took 0.31 seconds to deallocate network for instance.
[ 2302.451186] env[68906]: INFO nova.scheduler.client.report [None req-08d15b06-bc1d-4202-8ca5-a913682ec4aa tempest-ServerDiskConfigTestJSON-1909384467 tempest-ServerDiskConfigTestJSON-1909384467-project-member] Deleted allocations for instance 860248ea-e77b-4ff6-af64-b75f88a31348
[ 2302.478389] env[68906]: DEBUG oslo_concurrency.lockutils [None req-08d15b06-bc1d-4202-8ca5-a913682ec4aa tempest-ServerDiskConfigTestJSON-1909384467 tempest-ServerDiskConfigTestJSON-1909384467-project-member] Lock "860248ea-e77b-4ff6-af64-b75f88a31348" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 649.632s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 2302.478389] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Lock "860248ea-e77b-4ff6-af64-b75f88a31348" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 479.306s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 2302.478622] env[68906]: INFO nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: 860248ea-e77b-4ff6-af64-b75f88a31348] During sync_power_state the instance has a pending task (spawning). Skip.
[ 2302.478622] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Lock "860248ea-e77b-4ff6-af64-b75f88a31348" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2302.479113] env[68906]: DEBUG oslo_concurrency.lockutils [None req-400efce6-9f91-4c1f-9c17-29902c0c577a tempest-ServerDiskConfigTestJSON-1909384467 tempest-ServerDiskConfigTestJSON-1909384467-project-member] Lock "860248ea-e77b-4ff6-af64-b75f88a31348" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 454.147s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2302.479340] env[68906]: DEBUG oslo_concurrency.lockutils [None req-400efce6-9f91-4c1f-9c17-29902c0c577a tempest-ServerDiskConfigTestJSON-1909384467 tempest-ServerDiskConfigTestJSON-1909384467-project-member] Acquiring lock "860248ea-e77b-4ff6-af64-b75f88a31348-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2302.479541] env[68906]: DEBUG oslo_concurrency.lockutils [None req-400efce6-9f91-4c1f-9c17-29902c0c577a tempest-ServerDiskConfigTestJSON-1909384467 tempest-ServerDiskConfigTestJSON-1909384467-project-member] Lock "860248ea-e77b-4ff6-af64-b75f88a31348-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2302.480391] env[68906]: DEBUG oslo_concurrency.lockutils [None req-400efce6-9f91-4c1f-9c17-29902c0c577a tempest-ServerDiskConfigTestJSON-1909384467 tempest-ServerDiskConfigTestJSON-1909384467-project-member] Lock "860248ea-e77b-4ff6-af64-b75f88a31348-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2302.481666] env[68906]: INFO nova.compute.manager [None req-400efce6-9f91-4c1f-9c17-29902c0c577a tempest-ServerDiskConfigTestJSON-1909384467 tempest-ServerDiskConfigTestJSON-1909384467-project-member] [instance: 860248ea-e77b-4ff6-af64-b75f88a31348] Terminating instance [ 2302.486023] env[68906]: DEBUG nova.compute.manager [None req-400efce6-9f91-4c1f-9c17-29902c0c577a tempest-ServerDiskConfigTestJSON-1909384467 tempest-ServerDiskConfigTestJSON-1909384467-project-member] [instance: 860248ea-e77b-4ff6-af64-b75f88a31348] Start destroying the instance on the hypervisor. 
{{(pid=68906) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2302.486023] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-400efce6-9f91-4c1f-9c17-29902c0c577a tempest-ServerDiskConfigTestJSON-1909384467 tempest-ServerDiskConfigTestJSON-1909384467-project-member] [instance: 860248ea-e77b-4ff6-af64-b75f88a31348] Destroying instance {{(pid=68906) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2302.486438] env[68906]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-f8f34cab-0e59-47b1-a901-63584fc338e5 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2302.495642] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9b02cf35-8193-4641-b6a1-fcab0da9f30c {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2302.523652] env[68906]: WARNING nova.virt.vmwareapi.vmops [None req-400efce6-9f91-4c1f-9c17-29902c0c577a tempest-ServerDiskConfigTestJSON-1909384467 tempest-ServerDiskConfigTestJSON-1909384467-project-member] [instance: 860248ea-e77b-4ff6-af64-b75f88a31348] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 860248ea-e77b-4ff6-af64-b75f88a31348 could not be found. [ 2302.523859] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-400efce6-9f91-4c1f-9c17-29902c0c577a tempest-ServerDiskConfigTestJSON-1909384467 tempest-ServerDiskConfigTestJSON-1909384467-project-member] [instance: 860248ea-e77b-4ff6-af64-b75f88a31348] Instance destroyed {{(pid=68906) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2302.524042] env[68906]: INFO nova.compute.manager [None req-400efce6-9f91-4c1f-9c17-29902c0c577a tempest-ServerDiskConfigTestJSON-1909384467 tempest-ServerDiskConfigTestJSON-1909384467-project-member] [instance: 860248ea-e77b-4ff6-af64-b75f88a31348] Took 0.04 seconds to destroy the instance on the hypervisor. [ 2302.524298] env[68906]: DEBUG oslo.service.loopingcall [None req-400efce6-9f91-4c1f-9c17-29902c0c577a tempest-ServerDiskConfigTestJSON-1909384467 tempest-ServerDiskConfigTestJSON-1909384467-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=68906) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2302.524533] env[68906]: DEBUG nova.compute.manager [-] [instance: 860248ea-e77b-4ff6-af64-b75f88a31348] Deallocating network for instance {{(pid=68906) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 2302.524627] env[68906]: DEBUG nova.network.neutron [-] [instance: 860248ea-e77b-4ff6-af64-b75f88a31348] deallocate_for_instance() {{(pid=68906) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2302.548643] env[68906]: DEBUG nova.network.neutron [-] [instance: 860248ea-e77b-4ff6-af64-b75f88a31348] Updating instance_info_cache with network_info: [] {{(pid=68906) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2302.556417] env[68906]: INFO nova.compute.manager [-] [instance: 860248ea-e77b-4ff6-af64-b75f88a31348] Took 0.03 seconds to deallocate network for instance.
[ 2302.653230] env[68906]: DEBUG oslo_concurrency.lockutils [None req-400efce6-9f91-4c1f-9c17-29902c0c577a tempest-ServerDiskConfigTestJSON-1909384467 tempest-ServerDiskConfigTestJSON-1909384467-project-member] Lock "860248ea-e77b-4ff6-af64-b75f88a31348" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.174s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2334.482603] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2336.140698] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2337.141983] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2337.141983] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Starting heal instance info cache {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 2337.141983] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Rebuilding the list of instances to heal {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 2337.162196] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: 3cfde5a7-3148-426c-8867-ffafb33dc95b] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2337.162424] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: 01b79dfa-cd20-495d-b112-8429c28b741e] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2337.162490] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: 8bfc91d4-b1d7-449a-8d48-0e63490fe663] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2337.162601] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: d70b039d-c8ad-4ffd-84f8-08f17cb97578] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2337.162724] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: ed276c3c-6085-427d-b3b7-86bbb8660dbc] Skipping network cache update for instance because it is Building.
{{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2337.162846] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: cd208e67-55a3-4c0b-ad49-abd3a700d5ef] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2337.162966] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: a4f1c6a3-c189-4e3a-8ac9-ac6ec8b95723] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2337.163102] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: 0d12bd8f-0e92-4066-9ada-6eff7b4c5dbe] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2337.163227] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Didn't find any instances for network info cache update. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 2340.141275] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2341.140195] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2342.141853] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2346.140222] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2346.140495] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=68906) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 2347.334252] env[68906]: WARNING oslo_vmware.rw_handles [None req-d367298b-38d5-456e-960c-a902d561e42c tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 2347.334252] env[68906]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 2347.334252] env[68906]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 2347.334252] env[68906]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 2347.334252] env[68906]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 2347.334252] env[68906]: ERROR oslo_vmware.rw_handles response.begin() [ 2347.334252] env[68906]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 2347.334252] env[68906]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 2347.334252] env[68906]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 2347.334252] env[68906]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 2347.334252] env[68906]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 2347.334252] env[68906]: ERROR oslo_vmware.rw_handles [ 2347.334833] env[68906]: DEBUG nova.virt.vmwareapi.images [None req-d367298b-38d5-456e-960c-a902d561e42c tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] [instance: 3cfde5a7-3148-426c-8867-ffafb33dc95b] Downloaded image file data b1400c31-d33b-4e13-944f-4c645e62493e to vmware_temp/476f7592-8587-469f-836c-a74865978e9c/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk on the data store datastore2 {{(pid=68906) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 2347.337094] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-d367298b-38d5-456e-960c-a902d561e42c tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] [instance: 3cfde5a7-3148-426c-8867-ffafb33dc95b] Caching image {{(pid=68906) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 2347.337370] env[68906]: DEBUG nova.virt.vmwareapi.vm_util [None req-d367298b-38d5-456e-960c-a902d561e42c tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] Copying Virtual Disk [datastore2] vmware_temp/476f7592-8587-469f-836c-a74865978e9c/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk to [datastore2] vmware_temp/476f7592-8587-469f-836c-a74865978e9c/b1400c31-d33b-4e13-944f-4c645e62493e/b1400c31-d33b-4e13-944f-4c645e62493e.vmdk {{(pid=68906) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 2347.337674] env[68906]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-9f657e73-9ae7-4c90-b5b6-6c247159af99 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2347.347195] env[68906]: DEBUG oslo_vmware.api [None req-d367298b-38d5-456e-960c-a902d561e42c tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] Waiting for the task: (returnval){ [ 2347.347195] env[68906]: value = "task-3475480" [ 
2347.347195] env[68906]: _type = "Task" [ 2347.347195] env[68906]: } to complete. {{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2347.355123] env[68906]: DEBUG oslo_vmware.api [None req-d367298b-38d5-456e-960c-a902d561e42c tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] Task: {'id': task-3475480, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2347.857869] env[68906]: DEBUG oslo_vmware.exceptions [None req-d367298b-38d5-456e-960c-a902d561e42c tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] Fault InvalidArgument not matched. {{(pid=68906) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 2347.858159] env[68906]: DEBUG oslo_concurrency.lockutils [None req-d367298b-38d5-456e-960c-a902d561e42c tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] Releasing lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e/b1400c31-d33b-4e13-944f-4c645e62493e.vmdk" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2347.858710] env[68906]: ERROR nova.compute.manager [None req-d367298b-38d5-456e-960c-a902d561e42c tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] [instance: 3cfde5a7-3148-426c-8867-ffafb33dc95b] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2347.858710] env[68906]: Faults: ['InvalidArgument'] [ 2347.858710] env[68906]: ERROR nova.compute.manager [instance: 3cfde5a7-3148-426c-8867-ffafb33dc95b] Traceback (most recent call last): [ 2347.858710] env[68906]: ERROR nova.compute.manager [instance: 3cfde5a7-3148-426c-8867-ffafb33dc95b] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 2347.858710] env[68906]: ERROR nova.compute.manager [instance: 3cfde5a7-3148-426c-8867-ffafb33dc95b] yield resources [ 2347.858710] env[68906]: ERROR nova.compute.manager [instance: 3cfde5a7-3148-426c-8867-ffafb33dc95b] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 2347.858710] env[68906]: ERROR nova.compute.manager [instance: 3cfde5a7-3148-426c-8867-ffafb33dc95b] self.driver.spawn(context, instance, image_meta, [ 2347.858710] env[68906]: ERROR nova.compute.manager [instance: 3cfde5a7-3148-426c-8867-ffafb33dc95b] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2347.858710] env[68906]: ERROR nova.compute.manager [instance: 3cfde5a7-3148-426c-8867-ffafb33dc95b] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2347.858710] env[68906]: ERROR nova.compute.manager [instance: 3cfde5a7-3148-426c-8867-ffafb33dc95b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2347.858710] env[68906]: ERROR nova.compute.manager [instance: 3cfde5a7-3148-426c-8867-ffafb33dc95b] self._fetch_image_if_missing(context, vi) [ 2347.858710] env[68906]: ERROR nova.compute.manager [instance: 3cfde5a7-3148-426c-8867-ffafb33dc95b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2347.858710] env[68906]: ERROR nova.compute.manager [instance: 3cfde5a7-3148-426c-8867-ffafb33dc95b] image_cache(vi, tmp_image_ds_loc) [ 2347.859099] env[68906]: ERROR nova.compute.manager [instance: 
3cfde5a7-3148-426c-8867-ffafb33dc95b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2347.859099] env[68906]: ERROR nova.compute.manager [instance: 3cfde5a7-3148-426c-8867-ffafb33dc95b] vm_util.copy_virtual_disk( [ 2347.859099] env[68906]: ERROR nova.compute.manager [instance: 3cfde5a7-3148-426c-8867-ffafb33dc95b] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2347.859099] env[68906]: ERROR nova.compute.manager [instance: 3cfde5a7-3148-426c-8867-ffafb33dc95b] session._wait_for_task(vmdk_copy_task) [ 2347.859099] env[68906]: ERROR nova.compute.manager [instance: 3cfde5a7-3148-426c-8867-ffafb33dc95b] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2347.859099] env[68906]: ERROR nova.compute.manager [instance: 3cfde5a7-3148-426c-8867-ffafb33dc95b] return self.wait_for_task(task_ref) [ 2347.859099] env[68906]: ERROR nova.compute.manager [instance: 3cfde5a7-3148-426c-8867-ffafb33dc95b] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2347.859099] env[68906]: ERROR nova.compute.manager [instance: 3cfde5a7-3148-426c-8867-ffafb33dc95b] return evt.wait() [ 2347.859099] env[68906]: ERROR nova.compute.manager [instance: 3cfde5a7-3148-426c-8867-ffafb33dc95b] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2347.859099] env[68906]: ERROR nova.compute.manager [instance: 3cfde5a7-3148-426c-8867-ffafb33dc95b] result = hub.switch() [ 2347.859099] env[68906]: ERROR nova.compute.manager [instance: 3cfde5a7-3148-426c-8867-ffafb33dc95b] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2347.859099] env[68906]: ERROR nova.compute.manager [instance: 3cfde5a7-3148-426c-8867-ffafb33dc95b] return self.greenlet.switch() [ 2347.859099] env[68906]: ERROR nova.compute.manager [instance: 3cfde5a7-3148-426c-8867-ffafb33dc95b] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2347.859416] env[68906]: ERROR nova.compute.manager [instance: 3cfde5a7-3148-426c-8867-ffafb33dc95b] self.f(*self.args, **self.kw) [ 2347.859416] env[68906]: ERROR nova.compute.manager [instance: 3cfde5a7-3148-426c-8867-ffafb33dc95b] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2347.859416] env[68906]: ERROR nova.compute.manager [instance: 3cfde5a7-3148-426c-8867-ffafb33dc95b] raise exceptions.translate_fault(task_info.error) [ 2347.859416] env[68906]: ERROR nova.compute.manager [instance: 3cfde5a7-3148-426c-8867-ffafb33dc95b] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2347.859416] env[68906]: ERROR nova.compute.manager [instance: 3cfde5a7-3148-426c-8867-ffafb33dc95b] Faults: ['InvalidArgument'] [ 2347.859416] env[68906]: ERROR nova.compute.manager [instance: 3cfde5a7-3148-426c-8867-ffafb33dc95b] [ 2347.859416] env[68906]: INFO nova.compute.manager [None req-d367298b-38d5-456e-960c-a902d561e42c tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] [instance: 3cfde5a7-3148-426c-8867-ffafb33dc95b] Terminating instance [ 2347.860600] env[68906]: DEBUG oslo_concurrency.lockutils [None req-f44832be-2f4a-4582-8745-029b5c53cab6 tempest-ServerActionsTestOtherA-1507860010 tempest-ServerActionsTestOtherA-1507860010-project-member] Acquired lock "[datastore2] 
devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e/b1400c31-d33b-4e13-944f-4c645e62493e.vmdk" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2347.860836] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-f44832be-2f4a-4582-8745-029b5c53cab6 tempest-ServerActionsTestOtherA-1507860010 tempest-ServerActionsTestOtherA-1507860010-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=68906) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2347.861096] env[68906]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-49c4f7f0-2752-4fb3-a160-b1470e58f3d1 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2347.863467] env[68906]: DEBUG nova.compute.manager [None req-d367298b-38d5-456e-960c-a902d561e42c tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] [instance: 3cfde5a7-3148-426c-8867-ffafb33dc95b] Start destroying the instance on the hypervisor. {{(pid=68906) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2347.863723] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-d367298b-38d5-456e-960c-a902d561e42c tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] [instance: 3cfde5a7-3148-426c-8867-ffafb33dc95b] Destroying instance {{(pid=68906) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2347.864454] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-92c55b55-e82b-4304-b8b9-a96a67d6d3e2 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2347.871114] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-d367298b-38d5-456e-960c-a902d561e42c tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] [instance: 3cfde5a7-3148-426c-8867-ffafb33dc95b] Unregistering the VM {{(pid=68906) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 2347.871335] env[68906]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-58a641c1-7f29-44d7-a399-3d29dac3c843 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2347.873449] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-f44832be-2f4a-4582-8745-029b5c53cab6 tempest-ServerActionsTestOtherA-1507860010 tempest-ServerActionsTestOtherA-1507860010-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=68906) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2347.873632] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-f44832be-2f4a-4582-8745-029b5c53cab6 tempest-ServerActionsTestOtherA-1507860010 tempest-ServerActionsTestOtherA-1507860010-project-member] Folder [datastore2] devstack-image-cache_base created. 
{{(pid=68906) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 2347.874602] env[68906]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-4df19912-7f43-410c-96f5-335f35a7bfc9 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2347.879616] env[68906]: DEBUG oslo_vmware.api [None req-f44832be-2f4a-4582-8745-029b5c53cab6 tempest-ServerActionsTestOtherA-1507860010 tempest-ServerActionsTestOtherA-1507860010-project-member] Waiting for the task: (returnval){ [ 2347.879616] env[68906]: value = "session[52a3cb0d-1212-490c-2669-91043b4da4d8]528e8f51-d445-4856-bd42-9c58f5335e41" [ 2347.879616] env[68906]: _type = "Task" [ 2347.879616] env[68906]: } to complete. {{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2347.886280] env[68906]: DEBUG oslo_vmware.api [None req-f44832be-2f4a-4582-8745-029b5c53cab6 tempest-ServerActionsTestOtherA-1507860010 tempest-ServerActionsTestOtherA-1507860010-project-member] Task: {'id': session[52a3cb0d-1212-490c-2669-91043b4da4d8]528e8f51-d445-4856-bd42-9c58f5335e41, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2347.960194] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-d367298b-38d5-456e-960c-a902d561e42c tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] [instance: 3cfde5a7-3148-426c-8867-ffafb33dc95b] Unregistered the VM {{(pid=68906) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 2347.960418] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-d367298b-38d5-456e-960c-a902d561e42c tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] [instance: 3cfde5a7-3148-426c-8867-ffafb33dc95b] Deleting contents of the VM from datastore datastore2 {{(pid=68906) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 2347.960592] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-d367298b-38d5-456e-960c-a902d561e42c tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] Deleting the datastore file [datastore2] 3cfde5a7-3148-426c-8867-ffafb33dc95b {{(pid=68906) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 2347.960869] env[68906]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-728022ba-d6c2-44f3-a160-18990a85f793 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2347.966963] env[68906]: DEBUG oslo_vmware.api [None req-d367298b-38d5-456e-960c-a902d561e42c tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] Waiting for the task: (returnval){ [ 2347.966963] env[68906]: value = "task-3475482" [ 2347.966963] env[68906]: _type = "Task" [ 2347.966963] env[68906]: } to complete. {{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2347.974759] env[68906]: DEBUG oslo_vmware.api [None req-d367298b-38d5-456e-960c-a902d561e42c tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] Task: {'id': task-3475482, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2348.389981] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-f44832be-2f4a-4582-8745-029b5c53cab6 tempest-ServerActionsTestOtherA-1507860010 tempest-ServerActionsTestOtherA-1507860010-project-member] [instance: 01b79dfa-cd20-495d-b112-8429c28b741e] Preparing fetch location {{(pid=68906) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 2348.390272] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-f44832be-2f4a-4582-8745-029b5c53cab6 tempest-ServerActionsTestOtherA-1507860010 tempest-ServerActionsTestOtherA-1507860010-project-member] Creating directory with path [datastore2] vmware_temp/9c69bc3a-9dae-4e50-8477-0265ebb04b0b/b1400c31-d33b-4e13-944f-4c645e62493e {{(pid=68906) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2348.390494] env[68906]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-b013a012-bd03-45ad-adc0-cc4510651b08 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2348.401804] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-f44832be-2f4a-4582-8745-029b5c53cab6 tempest-ServerActionsTestOtherA-1507860010 tempest-ServerActionsTestOtherA-1507860010-project-member] Created directory with path [datastore2] vmware_temp/9c69bc3a-9dae-4e50-8477-0265ebb04b0b/b1400c31-d33b-4e13-944f-4c645e62493e {{(pid=68906) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2348.401972] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-f44832be-2f4a-4582-8745-029b5c53cab6 tempest-ServerActionsTestOtherA-1507860010 tempest-ServerActionsTestOtherA-1507860010-project-member] [instance: 01b79dfa-cd20-495d-b112-8429c28b741e] Fetch image to [datastore2] vmware_temp/9c69bc3a-9dae-4e50-8477-0265ebb04b0b/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk {{(pid=68906) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 2348.402154] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-f44832be-2f4a-4582-8745-029b5c53cab6 tempest-ServerActionsTestOtherA-1507860010 tempest-ServerActionsTestOtherA-1507860010-project-member] [instance: 01b79dfa-cd20-495d-b112-8429c28b741e] Downloading image file data b1400c31-d33b-4e13-944f-4c645e62493e to [datastore2] vmware_temp/9c69bc3a-9dae-4e50-8477-0265ebb04b0b/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk on the data store datastore2 {{(pid=68906) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 2348.402874] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-11cb3acf-fd57-45a8-98b9-572b8ecfe4a8 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2348.409338] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-565dd221-b5ae-4393-87b1-a4db92c073b2 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2348.417949] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7d707522-399f-4894-902f-1985a1713869 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2348.448688] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-00a01b4b-ca3a-4653-a2fc-c81c0f2a35b6 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2348.453944] env[68906]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-33cf7550-f677-45f4-ac80-dc54dfc4bd48 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2348.475672] env[68906]: DEBUG oslo_vmware.api [None req-d367298b-38d5-456e-960c-a902d561e42c tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] Task: {'id': task-3475482, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.062377} completed successfully. {{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2348.477060] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-d367298b-38d5-456e-960c-a902d561e42c tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] Deleted the datastore file {{(pid=68906) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 2348.477257] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-d367298b-38d5-456e-960c-a902d561e42c tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] [instance: 3cfde5a7-3148-426c-8867-ffafb33dc95b] Deleted contents of the VM from datastore datastore2 {{(pid=68906) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 2348.477430] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-d367298b-38d5-456e-960c-a902d561e42c tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] [instance: 3cfde5a7-3148-426c-8867-ffafb33dc95b] Instance destroyed {{(pid=68906) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2348.477602] env[68906]: INFO nova.compute.manager [None req-d367298b-38d5-456e-960c-a902d561e42c tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] [instance: 3cfde5a7-3148-426c-8867-ffafb33dc95b] Took 0.61 seconds to destroy the instance on the hypervisor. 
[ 2348.479341] env[68906]: DEBUG nova.virt.vmwareapi.images [None req-f44832be-2f4a-4582-8745-029b5c53cab6 tempest-ServerActionsTestOtherA-1507860010 tempest-ServerActionsTestOtherA-1507860010-project-member] [instance: 01b79dfa-cd20-495d-b112-8429c28b741e] Downloading image file data b1400c31-d33b-4e13-944f-4c645e62493e to the data store datastore2 {{(pid=68906) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 2348.481352] env[68906]: DEBUG nova.compute.claims [None req-d367298b-38d5-456e-960c-a902d561e42c tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] [instance: 3cfde5a7-3148-426c-8867-ffafb33dc95b] Aborting claim: {{(pid=68906) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 2348.481520] env[68906]: DEBUG oslo_concurrency.lockutils [None req-d367298b-38d5-456e-960c-a902d561e42c tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2348.481733] env[68906]: DEBUG oslo_concurrency.lockutils [None req-d367298b-38d5-456e-960c-a902d561e42c tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2348.530038] env[68906]: DEBUG oslo_vmware.rw_handles [None req-f44832be-2f4a-4582-8745-029b5c53cab6 tempest-ServerActionsTestOtherA-1507860010 tempest-ServerActionsTestOtherA-1507860010-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/9c69bc3a-9dae-4e50-8477-0265ebb04b0b/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=68906) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 2348.590027] env[68906]: DEBUG oslo_vmware.rw_handles [None req-f44832be-2f4a-4582-8745-029b5c53cab6 tempest-ServerActionsTestOtherA-1507860010 tempest-ServerActionsTestOtherA-1507860010-project-member] Completed reading data from the image iterator. {{(pid=68906) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 2348.590125] env[68906]: DEBUG oslo_vmware.rw_handles [None req-f44832be-2f4a-4582-8745-029b5c53cab6 tempest-ServerActionsTestOtherA-1507860010 tempest-ServerActionsTestOtherA-1507860010-project-member] Closing write handle for https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/9c69bc3a-9dae-4e50-8477-0265ebb04b0b/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=68906) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 2348.671417] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6e6cd712-7058-4f74-a3c7-1c3a9b52bbea {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2348.678484] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9e39725e-6e5c-4e7a-99d9-3ddb0d774eee {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2348.707270] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1cc7d98c-70cc-489c-84c3-4ed1dc035738 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2348.714385] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-92b27570-de05-4320-949a-03751afffbd9 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2348.727190] env[68906]: DEBUG nova.compute.provider_tree [None req-d367298b-38d5-456e-960c-a902d561e42c tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] Inventory has not changed in ProviderTree for provider: 1119f6db-bfd7-4ef3-bdff-5c6974dc249b {{(pid=68906) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2348.735506] env[68906]: DEBUG nova.scheduler.client.report [None req-d367298b-38d5-456e-960c-a902d561e42c tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] Inventory has not changed for provider 1119f6db-bfd7-4ef3-bdff-5c6974dc249b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 93, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68906) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2348.748869] env[68906]: DEBUG oslo_concurrency.lockutils [None req-d367298b-38d5-456e-960c-a902d561e42c tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.267s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2348.749501] env[68906]: ERROR nova.compute.manager [None req-d367298b-38d5-456e-960c-a902d561e42c tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] [instance: 3cfde5a7-3148-426c-8867-ffafb33dc95b] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2348.749501] env[68906]: Faults: ['InvalidArgument'] [ 2348.749501] env[68906]: ERROR nova.compute.manager [instance: 3cfde5a7-3148-426c-8867-ffafb33dc95b] Traceback (most recent call last): [ 2348.749501] env[68906]: ERROR nova.compute.manager [instance: 3cfde5a7-3148-426c-8867-ffafb33dc95b] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 2348.749501] env[68906]: ERROR nova.compute.manager [instance: 
3cfde5a7-3148-426c-8867-ffafb33dc95b] self.driver.spawn(context, instance, image_meta, [ 2348.749501] env[68906]: ERROR nova.compute.manager [instance: 3cfde5a7-3148-426c-8867-ffafb33dc95b] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2348.749501] env[68906]: ERROR nova.compute.manager [instance: 3cfde5a7-3148-426c-8867-ffafb33dc95b] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2348.749501] env[68906]: ERROR nova.compute.manager [instance: 3cfde5a7-3148-426c-8867-ffafb33dc95b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2348.749501] env[68906]: ERROR nova.compute.manager [instance: 3cfde5a7-3148-426c-8867-ffafb33dc95b] self._fetch_image_if_missing(context, vi) [ 2348.749501] env[68906]: ERROR nova.compute.manager [instance: 3cfde5a7-3148-426c-8867-ffafb33dc95b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2348.749501] env[68906]: ERROR nova.compute.manager [instance: 3cfde5a7-3148-426c-8867-ffafb33dc95b] image_cache(vi, tmp_image_ds_loc) [ 2348.749501] env[68906]: ERROR nova.compute.manager [instance: 3cfde5a7-3148-426c-8867-ffafb33dc95b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2348.749815] env[68906]: ERROR nova.compute.manager [instance: 3cfde5a7-3148-426c-8867-ffafb33dc95b] vm_util.copy_virtual_disk( [ 2348.749815] env[68906]: ERROR nova.compute.manager [instance: 3cfde5a7-3148-426c-8867-ffafb33dc95b] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2348.749815] env[68906]: ERROR nova.compute.manager [instance: 3cfde5a7-3148-426c-8867-ffafb33dc95b] session._wait_for_task(vmdk_copy_task) [ 2348.749815] env[68906]: ERROR nova.compute.manager [instance: 3cfde5a7-3148-426c-8867-ffafb33dc95b] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2348.749815] env[68906]: ERROR nova.compute.manager [instance: 3cfde5a7-3148-426c-8867-ffafb33dc95b] return self.wait_for_task(task_ref) [ 2348.749815] env[68906]: ERROR nova.compute.manager [instance: 3cfde5a7-3148-426c-8867-ffafb33dc95b] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2348.749815] env[68906]: ERROR nova.compute.manager [instance: 3cfde5a7-3148-426c-8867-ffafb33dc95b] return evt.wait() [ 2348.749815] env[68906]: ERROR nova.compute.manager [instance: 3cfde5a7-3148-426c-8867-ffafb33dc95b] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2348.749815] env[68906]: ERROR nova.compute.manager [instance: 3cfde5a7-3148-426c-8867-ffafb33dc95b] result = hub.switch() [ 2348.749815] env[68906]: ERROR nova.compute.manager [instance: 3cfde5a7-3148-426c-8867-ffafb33dc95b] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2348.749815] env[68906]: ERROR nova.compute.manager [instance: 3cfde5a7-3148-426c-8867-ffafb33dc95b] return self.greenlet.switch() [ 2348.749815] env[68906]: ERROR nova.compute.manager [instance: 3cfde5a7-3148-426c-8867-ffafb33dc95b] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2348.749815] env[68906]: ERROR nova.compute.manager [instance: 3cfde5a7-3148-426c-8867-ffafb33dc95b] self.f(*self.args, **self.kw) [ 2348.750199] env[68906]: ERROR nova.compute.manager [instance: 3cfde5a7-3148-426c-8867-ffafb33dc95b] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2348.750199] env[68906]: ERROR nova.compute.manager [instance: 3cfde5a7-3148-426c-8867-ffafb33dc95b] raise exceptions.translate_fault(task_info.error) [ 2348.750199] env[68906]: ERROR nova.compute.manager [instance: 3cfde5a7-3148-426c-8867-ffafb33dc95b] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2348.750199] env[68906]: ERROR nova.compute.manager [instance: 3cfde5a7-3148-426c-8867-ffafb33dc95b] Faults: ['InvalidArgument'] [ 2348.750199] env[68906]: ERROR nova.compute.manager [instance: 3cfde5a7-3148-426c-8867-ffafb33dc95b] [ 2348.750199] env[68906]: DEBUG nova.compute.utils [None req-d367298b-38d5-456e-960c-a902d561e42c tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] [instance: 3cfde5a7-3148-426c-8867-ffafb33dc95b] VimFaultException {{(pid=68906) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 2348.751615] env[68906]: DEBUG nova.compute.manager [None req-d367298b-38d5-456e-960c-a902d561e42c tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] [instance: 3cfde5a7-3148-426c-8867-ffafb33dc95b] Build of instance 3cfde5a7-3148-426c-8867-ffafb33dc95b was re-scheduled: A specified parameter was not correct: fileType [ 2348.751615] env[68906]: Faults: ['InvalidArgument'] {{(pid=68906) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 2348.751987] env[68906]: DEBUG nova.compute.manager [None req-d367298b-38d5-456e-960c-a902d561e42c tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] [instance: 3cfde5a7-3148-426c-8867-ffafb33dc95b] Unplugging VIFs for instance {{(pid=68906) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 2348.752180] env[68906]: DEBUG nova.compute.manager [None req-d367298b-38d5-456e-960c-a902d561e42c tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=68906) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 2348.752357] env[68906]: DEBUG nova.compute.manager [None req-d367298b-38d5-456e-960c-a902d561e42c tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] [instance: 3cfde5a7-3148-426c-8867-ffafb33dc95b] Deallocating network for instance {{(pid=68906) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 2348.752518] env[68906]: DEBUG nova.network.neutron [None req-d367298b-38d5-456e-960c-a902d561e42c tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] [instance: 3cfde5a7-3148-426c-8867-ffafb33dc95b] deallocate_for_instance() {{(pid=68906) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2349.047891] env[68906]: DEBUG nova.network.neutron [None req-d367298b-38d5-456e-960c-a902d561e42c tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] [instance: 3cfde5a7-3148-426c-8867-ffafb33dc95b] Updating instance_info_cache with network_info: [] {{(pid=68906) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2349.062802] env[68906]: INFO nova.compute.manager [None req-d367298b-38d5-456e-960c-a902d561e42c tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] [instance: 3cfde5a7-3148-426c-8867-ffafb33dc95b] Took 0.31 seconds to deallocate network for instance. [ 2349.146641] env[68906]: INFO nova.scheduler.client.report [None req-d367298b-38d5-456e-960c-a902d561e42c tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] Deleted allocations for instance 3cfde5a7-3148-426c-8867-ffafb33dc95b [ 2349.169273] env[68906]: DEBUG oslo_concurrency.lockutils [None req-d367298b-38d5-456e-960c-a902d561e42c tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] Lock "3cfde5a7-3148-426c-8867-ffafb33dc95b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 670.173s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2349.169535] env[68906]: DEBUG oslo_concurrency.lockutils [None req-823968b1-da09-4e0a-ab45-09a081b0e509 tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] Lock "3cfde5a7-3148-426c-8867-ffafb33dc95b" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 474.417s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2349.169759] env[68906]: DEBUG oslo_concurrency.lockutils [None req-823968b1-da09-4e0a-ab45-09a081b0e509 tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] Acquiring lock "3cfde5a7-3148-426c-8867-ffafb33dc95b-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2349.169965] env[68906]: DEBUG oslo_concurrency.lockutils [None req-823968b1-da09-4e0a-ab45-09a081b0e509 tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] Lock "3cfde5a7-3148-426c-8867-ffafb33dc95b-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2349.170149] env[68906]:
DEBUG oslo_concurrency.lockutils [None req-823968b1-da09-4e0a-ab45-09a081b0e509 tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] Lock "3cfde5a7-3148-426c-8867-ffafb33dc95b-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2349.172020] env[68906]: INFO nova.compute.manager [None req-823968b1-da09-4e0a-ab45-09a081b0e509 tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] [instance: 3cfde5a7-3148-426c-8867-ffafb33dc95b] Terminating instance [ 2349.173744] env[68906]: DEBUG nova.compute.manager [None req-823968b1-da09-4e0a-ab45-09a081b0e509 tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] [instance: 3cfde5a7-3148-426c-8867-ffafb33dc95b] Start destroying the instance on the hypervisor. {{(pid=68906) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2349.173938] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-823968b1-da09-4e0a-ab45-09a081b0e509 tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] [instance: 3cfde5a7-3148-426c-8867-ffafb33dc95b] Destroying instance {{(pid=68906) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2349.174418] env[68906]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-3c8f7f96-64ff-4403-b67e-14e105ab1444 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2349.184245] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3a6171fc-4400-4125-a60e-7a5b18e82971 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2349.226499] env[68906]: WARNING nova.virt.vmwareapi.vmops [None req-823968b1-da09-4e0a-ab45-09a081b0e509 tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] [instance: 3cfde5a7-3148-426c-8867-ffafb33dc95b] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 3cfde5a7-3148-426c-8867-ffafb33dc95b could not be found. [ 2349.226722] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-823968b1-da09-4e0a-ab45-09a081b0e509 tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] [instance: 3cfde5a7-3148-426c-8867-ffafb33dc95b] Instance destroyed {{(pid=68906) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2349.226902] env[68906]: INFO nova.compute.manager [None req-823968b1-da09-4e0a-ab45-09a081b0e509 tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] [instance: 3cfde5a7-3148-426c-8867-ffafb33dc95b] Took 0.05 seconds to destroy the instance on the hypervisor. [ 2349.227170] env[68906]: DEBUG oslo.service.loopingcall [None req-823968b1-da09-4e0a-ab45-09a081b0e509 tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return.
{{(pid=68906) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2349.227431] env[68906]: DEBUG nova.compute.manager [-] [instance: 3cfde5a7-3148-426c-8867-ffafb33dc95b] Deallocating network for instance {{(pid=68906) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 2349.227534] env[68906]: DEBUG nova.network.neutron [-] [instance: 3cfde5a7-3148-426c-8867-ffafb33dc95b] deallocate_for_instance() {{(pid=68906) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2349.250485] env[68906]: DEBUG nova.network.neutron [-] [instance: 3cfde5a7-3148-426c-8867-ffafb33dc95b] Updating instance_info_cache with network_info: [] {{(pid=68906) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2349.258378] env[68906]: INFO nova.compute.manager [-] [instance: 3cfde5a7-3148-426c-8867-ffafb33dc95b] Took 0.03 seconds to deallocate network for instance. [ 2349.337452] env[68906]: DEBUG oslo_concurrency.lockutils [None req-823968b1-da09-4e0a-ab45-09a081b0e509 tempest-ImagesTestJSON-1546870080 tempest-ImagesTestJSON-1546870080-project-member] Lock "3cfde5a7-3148-426c-8867-ffafb33dc95b" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.168s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2351.137047] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2352.141586] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager.update_available_resource {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2352.153140] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2352.153364] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2352.153532] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2352.153699] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=68906) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 2352.154822] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9581cbe4-a55a-4065-8032-603a363be34f {{(pid=68906)
request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2352.163381] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d94f730d-a059-4329-8eff-ac17671c538a {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2352.177781] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5f9f6573-35f6-4ce6-a511-4ac9c6dcc8b8 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2352.183644] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ffa81ac2-33db-4a60-95eb-bbb2681baa0b {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2352.211523] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180971MB free_disk=93GB free_vcpus=48 pci_devices=None {{(pid=68906) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 2352.211639] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2352.212029] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2352.267229] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 01b79dfa-cd20-495d-b112-8429c28b741e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2352.267388] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 8bfc91d4-b1d7-449a-8d48-0e63490fe663 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2352.267541] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance d70b039d-c8ad-4ffd-84f8-08f17cb97578 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2352.267680] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance ed276c3c-6085-427d-b3b7-86bbb8660dbc actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2352.267798] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance cd208e67-55a3-4c0b-ad49-abd3a700d5ef actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2352.267915] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance a4f1c6a3-c189-4e3a-8ac9-ac6ec8b95723 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2352.268040] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 0d12bd8f-0e92-4066-9ada-6eff7b4c5dbe actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2352.268220] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Total usable vcpus: 48, total allocated vcpus: 7 {{(pid=68906) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 2352.268358] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1408MB phys_disk=200GB used_disk=7GB total_vcpus=48 used_vcpus=7 pci_stats=[] {{(pid=68906) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 2352.352293] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0e84e4eb-bc32-42a5-b074-72e6b8e44a9b {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2352.359512] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a5ce9f4c-bdec-4af7-829e-6499b86f5930 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2352.388999] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f6d95722-c55e-458e-87d4-818bf50fdf45 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2352.395568] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-09e2f5d1-4365-4c93-9274-9a03d733de6d {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2352.407975] env[68906]: DEBUG nova.compute.provider_tree [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Inventory has not changed in ProviderTree for provider: 1119f6db-bfd7-4ef3-bdff-5c6974dc249b {{(pid=68906) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2352.418108] env[68906]: DEBUG nova.scheduler.client.report [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Inventory has not changed for provider 
1119f6db-bfd7-4ef3-bdff-5c6974dc249b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 93, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68906) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2352.430649] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=68906) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 2352.430828] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.219s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2394.429963] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2396.268843] env[68906]: WARNING oslo_vmware.rw_handles [None req-f44832be-2f4a-4582-8745-029b5c53cab6 tempest-ServerActionsTestOtherA-1507860010 tempest-ServerActionsTestOtherA-1507860010-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 2396.268843] env[68906]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 2396.268843] env[68906]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 2396.268843] env[68906]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 2396.268843] env[68906]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 2396.268843] env[68906]: ERROR oslo_vmware.rw_handles response.begin() [ 2396.268843] env[68906]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 2396.268843] env[68906]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 2396.268843] env[68906]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 2396.268843] env[68906]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 2396.268843] env[68906]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 2396.268843] env[68906]: ERROR oslo_vmware.rw_handles [ 2396.268843] env[68906]: DEBUG nova.virt.vmwareapi.images [None req-f44832be-2f4a-4582-8745-029b5c53cab6 tempest-ServerActionsTestOtherA-1507860010 tempest-ServerActionsTestOtherA-1507860010-project-member] [instance: 01b79dfa-cd20-495d-b112-8429c28b741e] Downloaded image file data b1400c31-d33b-4e13-944f-4c645e62493e to vmware_temp/9c69bc3a-9dae-4e50-8477-0265ebb04b0b/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk on the data store datastore2 {{(pid=68906) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 2396.271503] env[68906]: DEBUG 
nova.virt.vmwareapi.vmops [None req-f44832be-2f4a-4582-8745-029b5c53cab6 tempest-ServerActionsTestOtherA-1507860010 tempest-ServerActionsTestOtherA-1507860010-project-member] [instance: 01b79dfa-cd20-495d-b112-8429c28b741e] Caching image {{(pid=68906) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 2396.271751] env[68906]: DEBUG nova.virt.vmwareapi.vm_util [None req-f44832be-2f4a-4582-8745-029b5c53cab6 tempest-ServerActionsTestOtherA-1507860010 tempest-ServerActionsTestOtherA-1507860010-project-member] Copying Virtual Disk [datastore2] vmware_temp/9c69bc3a-9dae-4e50-8477-0265ebb04b0b/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk to [datastore2] vmware_temp/9c69bc3a-9dae-4e50-8477-0265ebb04b0b/b1400c31-d33b-4e13-944f-4c645e62493e/b1400c31-d33b-4e13-944f-4c645e62493e.vmdk {{(pid=68906) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 2396.272041] env[68906]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-2e3c9ea9-2236-45e6-a104-6c1de492414c {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2396.280128] env[68906]: DEBUG oslo_vmware.api [None req-f44832be-2f4a-4582-8745-029b5c53cab6 tempest-ServerActionsTestOtherA-1507860010 tempest-ServerActionsTestOtherA-1507860010-project-member] Waiting for the task: (returnval){ [ 2396.280128] env[68906]: value = "task-3475483" [ 2396.280128] env[68906]: _type = "Task" [ 2396.280128] env[68906]: } to complete. {{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2396.287666] env[68906]: DEBUG oslo_vmware.api [None req-f44832be-2f4a-4582-8745-029b5c53cab6 tempest-ServerActionsTestOtherA-1507860010 tempest-ServerActionsTestOtherA-1507860010-project-member] Task: {'id': task-3475483, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2396.790767] env[68906]: DEBUG oslo_vmware.exceptions [None req-f44832be-2f4a-4582-8745-029b5c53cab6 tempest-ServerActionsTestOtherA-1507860010 tempest-ServerActionsTestOtherA-1507860010-project-member] Fault InvalidArgument not matched. 
{{(pid=68906) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 2396.790993] env[68906]: DEBUG oslo_concurrency.lockutils [None req-f44832be-2f4a-4582-8745-029b5c53cab6 tempest-ServerActionsTestOtherA-1507860010 tempest-ServerActionsTestOtherA-1507860010-project-member] Releasing lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e/b1400c31-d33b-4e13-944f-4c645e62493e.vmdk" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2396.791575] env[68906]: ERROR nova.compute.manager [None req-f44832be-2f4a-4582-8745-029b5c53cab6 tempest-ServerActionsTestOtherA-1507860010 tempest-ServerActionsTestOtherA-1507860010-project-member] [instance: 01b79dfa-cd20-495d-b112-8429c28b741e] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2396.791575] env[68906]: Faults: ['InvalidArgument'] [ 2396.791575] env[68906]: ERROR nova.compute.manager [instance: 01b79dfa-cd20-495d-b112-8429c28b741e] Traceback (most recent call last): [ 2396.791575] env[68906]: ERROR nova.compute.manager [instance: 01b79dfa-cd20-495d-b112-8429c28b741e] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 2396.791575] env[68906]: ERROR nova.compute.manager [instance: 01b79dfa-cd20-495d-b112-8429c28b741e] yield resources [ 2396.791575] env[68906]: ERROR nova.compute.manager [instance: 01b79dfa-cd20-495d-b112-8429c28b741e] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 2396.791575] env[68906]: ERROR nova.compute.manager [instance: 01b79dfa-cd20-495d-b112-8429c28b741e] self.driver.spawn(context, instance, image_meta, [ 2396.791575] env[68906]: ERROR nova.compute.manager [instance: 01b79dfa-cd20-495d-b112-8429c28b741e] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2396.791575] env[68906]: ERROR nova.compute.manager [instance: 01b79dfa-cd20-495d-b112-8429c28b741e] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2396.791575] env[68906]: ERROR nova.compute.manager [instance: 01b79dfa-cd20-495d-b112-8429c28b741e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2396.791575] env[68906]: ERROR nova.compute.manager [instance: 01b79dfa-cd20-495d-b112-8429c28b741e] self._fetch_image_if_missing(context, vi) [ 2396.791575] env[68906]: ERROR nova.compute.manager [instance: 01b79dfa-cd20-495d-b112-8429c28b741e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2396.791960] env[68906]: ERROR nova.compute.manager [instance: 01b79dfa-cd20-495d-b112-8429c28b741e] image_cache(vi, tmp_image_ds_loc) [ 2396.791960] env[68906]: ERROR nova.compute.manager [instance: 01b79dfa-cd20-495d-b112-8429c28b741e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2396.791960] env[68906]: ERROR nova.compute.manager [instance: 01b79dfa-cd20-495d-b112-8429c28b741e] vm_util.copy_virtual_disk( [ 2396.791960] env[68906]: ERROR nova.compute.manager [instance: 01b79dfa-cd20-495d-b112-8429c28b741e] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2396.791960] env[68906]: ERROR nova.compute.manager [instance: 01b79dfa-cd20-495d-b112-8429c28b741e] session._wait_for_task(vmdk_copy_task) [ 2396.791960] env[68906]: ERROR nova.compute.manager [instance: 01b79dfa-cd20-495d-b112-8429c28b741e] File 
"/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2396.791960] env[68906]: ERROR nova.compute.manager [instance: 01b79dfa-cd20-495d-b112-8429c28b741e] return self.wait_for_task(task_ref) [ 2396.791960] env[68906]: ERROR nova.compute.manager [instance: 01b79dfa-cd20-495d-b112-8429c28b741e] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2396.791960] env[68906]: ERROR nova.compute.manager [instance: 01b79dfa-cd20-495d-b112-8429c28b741e] return evt.wait() [ 2396.791960] env[68906]: ERROR nova.compute.manager [instance: 01b79dfa-cd20-495d-b112-8429c28b741e] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2396.791960] env[68906]: ERROR nova.compute.manager [instance: 01b79dfa-cd20-495d-b112-8429c28b741e] result = hub.switch() [ 2396.791960] env[68906]: ERROR nova.compute.manager [instance: 01b79dfa-cd20-495d-b112-8429c28b741e] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2396.791960] env[68906]: ERROR nova.compute.manager [instance: 01b79dfa-cd20-495d-b112-8429c28b741e] return self.greenlet.switch() [ 2396.792311] env[68906]: ERROR nova.compute.manager [instance: 01b79dfa-cd20-495d-b112-8429c28b741e] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2396.792311] env[68906]: ERROR nova.compute.manager [instance: 01b79dfa-cd20-495d-b112-8429c28b741e] self.f(*self.args, **self.kw) [ 2396.792311] env[68906]: ERROR nova.compute.manager [instance: 01b79dfa-cd20-495d-b112-8429c28b741e] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2396.792311] env[68906]: ERROR nova.compute.manager [instance: 01b79dfa-cd20-495d-b112-8429c28b741e] raise exceptions.translate_fault(task_info.error) [ 2396.792311] env[68906]: ERROR nova.compute.manager [instance: 01b79dfa-cd20-495d-b112-8429c28b741e] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2396.792311] env[68906]: ERROR nova.compute.manager [instance: 01b79dfa-cd20-495d-b112-8429c28b741e] Faults: ['InvalidArgument'] [ 2396.792311] env[68906]: ERROR nova.compute.manager [instance: 01b79dfa-cd20-495d-b112-8429c28b741e] [ 2396.792311] env[68906]: INFO nova.compute.manager [None req-f44832be-2f4a-4582-8745-029b5c53cab6 tempest-ServerActionsTestOtherA-1507860010 tempest-ServerActionsTestOtherA-1507860010-project-member] [instance: 01b79dfa-cd20-495d-b112-8429c28b741e] Terminating instance [ 2396.794026] env[68906]: DEBUG oslo_concurrency.lockutils [None req-6ecfa4f2-113a-4e91-9370-b3cb03d6f639 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Acquired lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e/b1400c31-d33b-4e13-944f-4c645e62493e.vmdk" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2396.794026] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-6ecfa4f2-113a-4e91-9370-b3cb03d6f639 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=68906) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2396.794026] env[68906]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-c6508461-1636-48b2-aedb-3ee1a24680b6 {{(pid=68906) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2396.796051] env[68906]: DEBUG nova.compute.manager [None req-f44832be-2f4a-4582-8745-029b5c53cab6 tempest-ServerActionsTestOtherA-1507860010 tempest-ServerActionsTestOtherA-1507860010-project-member] [instance: 01b79dfa-cd20-495d-b112-8429c28b741e] Start destroying the instance on the hypervisor. {{(pid=68906) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2396.796251] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-f44832be-2f4a-4582-8745-029b5c53cab6 tempest-ServerActionsTestOtherA-1507860010 tempest-ServerActionsTestOtherA-1507860010-project-member] [instance: 01b79dfa-cd20-495d-b112-8429c28b741e] Destroying instance {{(pid=68906) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2396.796973] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-21b6d7f4-3d57-4b2c-b67f-379e176c4418 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2396.803771] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-f44832be-2f4a-4582-8745-029b5c53cab6 tempest-ServerActionsTestOtherA-1507860010 tempest-ServerActionsTestOtherA-1507860010-project-member] [instance: 01b79dfa-cd20-495d-b112-8429c28b741e] Unregistering the VM {{(pid=68906) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 2396.803972] env[68906]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-70b844b7-0e9a-4f6c-8a01-f5cc76eb84a7 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2396.806064] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-6ecfa4f2-113a-4e91-9370-b3cb03d6f639 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=68906) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2396.806239] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-6ecfa4f2-113a-4e91-9370-b3cb03d6f639 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=68906) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 2396.807153] env[68906]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-befa030e-aacf-44e7-bd30-f7f288126922 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2396.811804] env[68906]: DEBUG oslo_vmware.api [None req-6ecfa4f2-113a-4e91-9370-b3cb03d6f639 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Waiting for the task: (returnval){ [ 2396.811804] env[68906]: value = "session[52a3cb0d-1212-490c-2669-91043b4da4d8]526d1abb-8174-a745-eb26-6712ba91d464" [ 2396.811804] env[68906]: _type = "Task" [ 2396.811804] env[68906]: } to complete. {{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2396.824633] env[68906]: DEBUG oslo_vmware.api [None req-6ecfa4f2-113a-4e91-9370-b3cb03d6f639 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Task: {'id': session[52a3cb0d-1212-490c-2669-91043b4da4d8]526d1abb-8174-a745-eb26-6712ba91d464, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2396.873644] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-f44832be-2f4a-4582-8745-029b5c53cab6 tempest-ServerActionsTestOtherA-1507860010 tempest-ServerActionsTestOtherA-1507860010-project-member] [instance: 01b79dfa-cd20-495d-b112-8429c28b741e] Unregistered the VM {{(pid=68906) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 2396.873852] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-f44832be-2f4a-4582-8745-029b5c53cab6 tempest-ServerActionsTestOtherA-1507860010 tempest-ServerActionsTestOtherA-1507860010-project-member] [instance: 01b79dfa-cd20-495d-b112-8429c28b741e] Deleting contents of the VM from datastore datastore2 {{(pid=68906) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 2396.874045] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-f44832be-2f4a-4582-8745-029b5c53cab6 tempest-ServerActionsTestOtherA-1507860010 tempest-ServerActionsTestOtherA-1507860010-project-member] Deleting the datastore file [datastore2] 01b79dfa-cd20-495d-b112-8429c28b741e {{(pid=68906) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 2396.874318] env[68906]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-a8be693d-ae1b-4aa6-90b6-0d8c2ec87e72 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2396.880709] env[68906]: DEBUG oslo_vmware.api [None req-f44832be-2f4a-4582-8745-029b5c53cab6 tempest-ServerActionsTestOtherA-1507860010 tempest-ServerActionsTestOtherA-1507860010-project-member] Waiting for the task: (returnval){ [ 2396.880709] env[68906]: value = "task-3475485" [ 2396.880709] env[68906]: _type = "Task" [ 2396.880709] env[68906]: } to complete. {{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2396.887997] env[68906]: DEBUG oslo_vmware.api [None req-f44832be-2f4a-4582-8745-029b5c53cab6 tempest-ServerActionsTestOtherA-1507860010 tempest-ServerActionsTestOtherA-1507860010-project-member] Task: {'id': task-3475485, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2397.321931] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-6ecfa4f2-113a-4e91-9370-b3cb03d6f639 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: 8bfc91d4-b1d7-449a-8d48-0e63490fe663] Preparing fetch location {{(pid=68906) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 2397.322232] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-6ecfa4f2-113a-4e91-9370-b3cb03d6f639 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Creating directory with path [datastore2] vmware_temp/6751d51c-5fa0-41d5-a5d3-d794bad71a2c/b1400c31-d33b-4e13-944f-4c645e62493e {{(pid=68906) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2397.322438] env[68906]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-1309283b-a226-4a0e-b6c2-7ffb9f05f82d {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2397.334414] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-6ecfa4f2-113a-4e91-9370-b3cb03d6f639 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Created directory with path [datastore2] vmware_temp/6751d51c-5fa0-41d5-a5d3-d794bad71a2c/b1400c31-d33b-4e13-944f-4c645e62493e {{(pid=68906) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2397.334607] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-6ecfa4f2-113a-4e91-9370-b3cb03d6f639 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: 8bfc91d4-b1d7-449a-8d48-0e63490fe663] Fetch image to [datastore2] vmware_temp/6751d51c-5fa0-41d5-a5d3-d794bad71a2c/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk {{(pid=68906) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 2397.334804] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-6ecfa4f2-113a-4e91-9370-b3cb03d6f639 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: 8bfc91d4-b1d7-449a-8d48-0e63490fe663] Downloading image file data b1400c31-d33b-4e13-944f-4c645e62493e to [datastore2] vmware_temp/6751d51c-5fa0-41d5-a5d3-d794bad71a2c/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk on the data store datastore2 {{(pid=68906) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 2397.335513] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1047ba05-2258-4352-a3c5-11738d413856 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2397.341876] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a67d773a-2827-424b-a58a-f8cf58b0adad {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2397.350523] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fc5aee26-9b9a-4439-8654-d481a5d8c260 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2397.380156] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b8884bcd-fa8d-49b9-9341-f942a0471281 {{(pid=68906) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2397.390061] env[68906]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-6cc0ccf0-860e-490c-9bde-1f0407d220d1 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2397.391622] env[68906]: DEBUG oslo_vmware.api [None req-f44832be-2f4a-4582-8745-029b5c53cab6 tempest-ServerActionsTestOtherA-1507860010 tempest-ServerActionsTestOtherA-1507860010-project-member] Task: {'id': task-3475485, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.079977} completed successfully. {{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2397.391849] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-f44832be-2f4a-4582-8745-029b5c53cab6 tempest-ServerActionsTestOtherA-1507860010 tempest-ServerActionsTestOtherA-1507860010-project-member] Deleted the datastore file {{(pid=68906) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 2397.392038] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-f44832be-2f4a-4582-8745-029b5c53cab6 tempest-ServerActionsTestOtherA-1507860010 tempest-ServerActionsTestOtherA-1507860010-project-member] [instance: 01b79dfa-cd20-495d-b112-8429c28b741e] Deleted contents of the VM from datastore datastore2 {{(pid=68906) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 2397.392212] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-f44832be-2f4a-4582-8745-029b5c53cab6 tempest-ServerActionsTestOtherA-1507860010 tempest-ServerActionsTestOtherA-1507860010-project-member] [instance: 01b79dfa-cd20-495d-b112-8429c28b741e] Instance destroyed {{(pid=68906) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2397.392381] env[68906]: INFO nova.compute.manager [None req-f44832be-2f4a-4582-8745-029b5c53cab6 tempest-ServerActionsTestOtherA-1507860010 tempest-ServerActionsTestOtherA-1507860010-project-member] [instance: 01b79dfa-cd20-495d-b112-8429c28b741e] Took 0.60 seconds to destroy the instance on the hypervisor. 
[ 2397.394393] env[68906]: DEBUG nova.compute.claims [None req-f44832be-2f4a-4582-8745-029b5c53cab6 tempest-ServerActionsTestOtherA-1507860010 tempest-ServerActionsTestOtherA-1507860010-project-member] [instance: 01b79dfa-cd20-495d-b112-8429c28b741e] Aborting claim: {{(pid=68906) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 2397.394573] env[68906]: DEBUG oslo_concurrency.lockutils [None req-f44832be-2f4a-4582-8745-029b5c53cab6 tempest-ServerActionsTestOtherA-1507860010 tempest-ServerActionsTestOtherA-1507860010-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2397.394797] env[68906]: DEBUG oslo_concurrency.lockutils [None req-f44832be-2f4a-4582-8745-029b5c53cab6 tempest-ServerActionsTestOtherA-1507860010 tempest-ServerActionsTestOtherA-1507860010-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2397.412943] env[68906]: DEBUG nova.virt.vmwareapi.images [None req-6ecfa4f2-113a-4e91-9370-b3cb03d6f639 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: 8bfc91d4-b1d7-449a-8d48-0e63490fe663] Downloading image file data b1400c31-d33b-4e13-944f-4c645e62493e to the data store datastore2 {{(pid=68906) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 2397.526330] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-90114ab8-351e-424d-a82a-8e32df87e542 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2397.530225] env[68906]: DEBUG oslo_vmware.rw_handles [None req-6ecfa4f2-113a-4e91-9370-b3cb03d6f639 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/6751d51c-5fa0-41d5-a5d3-d794bad71a2c/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=68906) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 2397.586589] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6dc9a714-0960-4361-9d03-0df01368272a {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2397.592286] env[68906]: DEBUG oslo_vmware.rw_handles [None req-6ecfa4f2-113a-4e91-9370-b3cb03d6f639 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Completed reading data from the image iterator. {{(pid=68906) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 2397.592480] env[68906]: DEBUG oslo_vmware.rw_handles [None req-6ecfa4f2-113a-4e91-9370-b3cb03d6f639 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Closing write handle for https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/6751d51c-5fa0-41d5-a5d3-d794bad71a2c/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=68906) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 2397.619282] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-03c161c8-d7ab-45f7-8d78-9cf64ff7fb57 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2397.626794] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8c991b9f-7996-424b-b766-1858d511b3ad {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2397.639966] env[68906]: DEBUG nova.compute.provider_tree [None req-f44832be-2f4a-4582-8745-029b5c53cab6 tempest-ServerActionsTestOtherA-1507860010 tempest-ServerActionsTestOtherA-1507860010-project-member] Inventory has not changed in ProviderTree for provider: 1119f6db-bfd7-4ef3-bdff-5c6974dc249b {{(pid=68906) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2397.648389] env[68906]: DEBUG nova.scheduler.client.report [None req-f44832be-2f4a-4582-8745-029b5c53cab6 tempest-ServerActionsTestOtherA-1507860010 tempest-ServerActionsTestOtherA-1507860010-project-member] Inventory has not changed for provider 1119f6db-bfd7-4ef3-bdff-5c6974dc249b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 93, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68906) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2397.664421] env[68906]: DEBUG oslo_concurrency.lockutils [None req-f44832be-2f4a-4582-8745-029b5c53cab6 tempest-ServerActionsTestOtherA-1507860010 tempest-ServerActionsTestOtherA-1507860010-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.270s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2397.664964] env[68906]: ERROR nova.compute.manager [None req-f44832be-2f4a-4582-8745-029b5c53cab6 tempest-ServerActionsTestOtherA-1507860010 tempest-ServerActionsTestOtherA-1507860010-project-member] [instance: 01b79dfa-cd20-495d-b112-8429c28b741e] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2397.664964] env[68906]: Faults: ['InvalidArgument'] [ 2397.664964] env[68906]: ERROR nova.compute.manager [instance: 01b79dfa-cd20-495d-b112-8429c28b741e] Traceback (most recent call last): [ 2397.664964] env[68906]: ERROR nova.compute.manager [instance: 01b79dfa-cd20-495d-b112-8429c28b741e] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 2397.664964] env[68906]: ERROR nova.compute.manager [instance: 01b79dfa-cd20-495d-b112-8429c28b741e] self.driver.spawn(context, instance, image_meta, [ 2397.664964] env[68906]: ERROR nova.compute.manager [instance: 01b79dfa-cd20-495d-b112-8429c28b741e] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2397.664964] env[68906]: ERROR nova.compute.manager [instance: 01b79dfa-cd20-495d-b112-8429c28b741e] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2397.664964] env[68906]: ERROR nova.compute.manager [instance: 
01b79dfa-cd20-495d-b112-8429c28b741e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2397.664964] env[68906]: ERROR nova.compute.manager [instance: 01b79dfa-cd20-495d-b112-8429c28b741e] self._fetch_image_if_missing(context, vi) [ 2397.664964] env[68906]: ERROR nova.compute.manager [instance: 01b79dfa-cd20-495d-b112-8429c28b741e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2397.664964] env[68906]: ERROR nova.compute.manager [instance: 01b79dfa-cd20-495d-b112-8429c28b741e] image_cache(vi, tmp_image_ds_loc) [ 2397.664964] env[68906]: ERROR nova.compute.manager [instance: 01b79dfa-cd20-495d-b112-8429c28b741e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2397.665423] env[68906]: ERROR nova.compute.manager [instance: 01b79dfa-cd20-495d-b112-8429c28b741e] vm_util.copy_virtual_disk( [ 2397.665423] env[68906]: ERROR nova.compute.manager [instance: 01b79dfa-cd20-495d-b112-8429c28b741e] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2397.665423] env[68906]: ERROR nova.compute.manager [instance: 01b79dfa-cd20-495d-b112-8429c28b741e] session._wait_for_task(vmdk_copy_task) [ 2397.665423] env[68906]: ERROR nova.compute.manager [instance: 01b79dfa-cd20-495d-b112-8429c28b741e] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2397.665423] env[68906]: ERROR nova.compute.manager [instance: 01b79dfa-cd20-495d-b112-8429c28b741e] return self.wait_for_task(task_ref) [ 2397.665423] env[68906]: ERROR nova.compute.manager [instance: 01b79dfa-cd20-495d-b112-8429c28b741e] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2397.665423] env[68906]: ERROR nova.compute.manager [instance: 01b79dfa-cd20-495d-b112-8429c28b741e] return evt.wait() [ 2397.665423] env[68906]: ERROR nova.compute.manager [instance: 01b79dfa-cd20-495d-b112-8429c28b741e] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2397.665423] env[68906]: ERROR nova.compute.manager [instance: 01b79dfa-cd20-495d-b112-8429c28b741e] result = hub.switch() [ 2397.665423] env[68906]: ERROR nova.compute.manager [instance: 01b79dfa-cd20-495d-b112-8429c28b741e] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2397.665423] env[68906]: ERROR nova.compute.manager [instance: 01b79dfa-cd20-495d-b112-8429c28b741e] return self.greenlet.switch() [ 2397.665423] env[68906]: ERROR nova.compute.manager [instance: 01b79dfa-cd20-495d-b112-8429c28b741e] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2397.665423] env[68906]: ERROR nova.compute.manager [instance: 01b79dfa-cd20-495d-b112-8429c28b741e] self.f(*self.args, **self.kw) [ 2397.665778] env[68906]: ERROR nova.compute.manager [instance: 01b79dfa-cd20-495d-b112-8429c28b741e] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2397.665778] env[68906]: ERROR nova.compute.manager [instance: 01b79dfa-cd20-495d-b112-8429c28b741e] raise exceptions.translate_fault(task_info.error) [ 2397.665778] env[68906]: ERROR nova.compute.manager [instance: 01b79dfa-cd20-495d-b112-8429c28b741e] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2397.665778] env[68906]: ERROR nova.compute.manager [instance: 01b79dfa-cd20-495d-b112-8429c28b741e] Faults: 
['InvalidArgument'] [ 2397.665778] env[68906]: ERROR nova.compute.manager [instance: 01b79dfa-cd20-495d-b112-8429c28b741e] [ 2397.665778] env[68906]: DEBUG nova.compute.utils [None req-f44832be-2f4a-4582-8745-029b5c53cab6 tempest-ServerActionsTestOtherA-1507860010 tempest-ServerActionsTestOtherA-1507860010-project-member] [instance: 01b79dfa-cd20-495d-b112-8429c28b741e] VimFaultException {{(pid=68906) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 2397.667442] env[68906]: DEBUG nova.compute.manager [None req-f44832be-2f4a-4582-8745-029b5c53cab6 tempest-ServerActionsTestOtherA-1507860010 tempest-ServerActionsTestOtherA-1507860010-project-member] [instance: 01b79dfa-cd20-495d-b112-8429c28b741e] Build of instance 01b79dfa-cd20-495d-b112-8429c28b741e was re-scheduled: A specified parameter was not correct: fileType [ 2397.667442] env[68906]: Faults: ['InvalidArgument'] {{(pid=68906) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 2397.667866] env[68906]: DEBUG nova.compute.manager [None req-f44832be-2f4a-4582-8745-029b5c53cab6 tempest-ServerActionsTestOtherA-1507860010 tempest-ServerActionsTestOtherA-1507860010-project-member] [instance: 01b79dfa-cd20-495d-b112-8429c28b741e] Unplugging VIFs for instance {{(pid=68906) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 2397.668077] env[68906]: DEBUG nova.compute.manager [None req-f44832be-2f4a-4582-8745-029b5c53cab6 tempest-ServerActionsTestOtherA-1507860010 tempest-ServerActionsTestOtherA-1507860010-project-member] Virt driver does not provide unplug_vifs method, so it is not possible to determine if VIFs should be unplugged. {{(pid=68906) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 2397.668272] env[68906]: DEBUG nova.compute.manager [None req-f44832be-2f4a-4582-8745-029b5c53cab6 tempest-ServerActionsTestOtherA-1507860010 tempest-ServerActionsTestOtherA-1507860010-project-member] [instance: 01b79dfa-cd20-495d-b112-8429c28b741e] Deallocating network for instance {{(pid=68906) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 2397.668435] env[68906]: DEBUG nova.network.neutron [None req-f44832be-2f4a-4582-8745-029b5c53cab6 tempest-ServerActionsTestOtherA-1507860010 tempest-ServerActionsTestOtherA-1507860010-project-member] [instance: 01b79dfa-cd20-495d-b112-8429c28b741e] deallocate_for_instance() {{(pid=68906) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2398.033960] env[68906]: DEBUG nova.network.neutron [None req-f44832be-2f4a-4582-8745-029b5c53cab6 tempest-ServerActionsTestOtherA-1507860010 tempest-ServerActionsTestOtherA-1507860010-project-member] [instance: 01b79dfa-cd20-495d-b112-8429c28b741e] Updating instance_info_cache with network_info: [] {{(pid=68906) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2398.044886] env[68906]: INFO nova.compute.manager [None req-f44832be-2f4a-4582-8745-029b5c53cab6 tempest-ServerActionsTestOtherA-1507860010 tempest-ServerActionsTestOtherA-1507860010-project-member] [instance: 01b79dfa-cd20-495d-b112-8429c28b741e] Took 0.38 seconds to deallocate network for instance.
[ 2398.140877] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2398.143631] env[68906]: INFO nova.scheduler.client.report [None req-f44832be-2f4a-4582-8745-029b5c53cab6 tempest-ServerActionsTestOtherA-1507860010 tempest-ServerActionsTestOtherA-1507860010-project-member] Deleted allocations for instance 01b79dfa-cd20-495d-b112-8429c28b741e [ 2398.166617] env[68906]: DEBUG oslo_concurrency.lockutils [None req-f44832be-2f4a-4582-8745-029b5c53cab6 tempest-ServerActionsTestOtherA-1507860010 tempest-ServerActionsTestOtherA-1507860010-project-member] Lock "01b79dfa-cd20-495d-b112-8429c28b741e" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 624.251s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2398.166947] env[68906]: DEBUG oslo_concurrency.lockutils [None req-39b2953e-c229-4fbf-a043-48005b810281 tempest-ServerActionsTestOtherA-1507860010 tempest-ServerActionsTestOtherA-1507860010-project-member] Lock "01b79dfa-cd20-495d-b112-8429c28b741e" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 428.247s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2398.167221] env[68906]: DEBUG oslo_concurrency.lockutils [None req-39b2953e-c229-4fbf-a043-48005b810281 tempest-ServerActionsTestOtherA-1507860010 tempest-ServerActionsTestOtherA-1507860010-project-member] Acquiring lock "01b79dfa-cd20-495d-b112-8429c28b741e-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2398.167439] env[68906]: DEBUG oslo_concurrency.lockutils [None req-39b2953e-c229-4fbf-a043-48005b810281 tempest-ServerActionsTestOtherA-1507860010 tempest-ServerActionsTestOtherA-1507860010-project-member] Lock "01b79dfa-cd20-495d-b112-8429c28b741e-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2398.167607] env[68906]: DEBUG oslo_concurrency.lockutils [None req-39b2953e-c229-4fbf-a043-48005b810281 tempest-ServerActionsTestOtherA-1507860010 tempest-ServerActionsTestOtherA-1507860010-project-member] Lock "01b79dfa-cd20-495d-b112-8429c28b741e-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2398.169552] env[68906]: INFO nova.compute.manager [None req-39b2953e-c229-4fbf-a043-48005b810281 tempest-ServerActionsTestOtherA-1507860010 tempest-ServerActionsTestOtherA-1507860010-project-member] [instance: 01b79dfa-cd20-495d-b112-8429c28b741e] Terminating instance [ 2398.171592] env[68906]: DEBUG nova.compute.manager [None req-39b2953e-c229-4fbf-a043-48005b810281 tempest-ServerActionsTestOtherA-1507860010 tempest-ServerActionsTestOtherA-1507860010-project-member] [instance: 01b79dfa-cd20-495d-b112-8429c28b741e] Start destroying the instance on the hypervisor.
{{(pid=68906) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2398.171592] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-39b2953e-c229-4fbf-a043-48005b810281 tempest-ServerActionsTestOtherA-1507860010 tempest-ServerActionsTestOtherA-1507860010-project-member] [instance: 01b79dfa-cd20-495d-b112-8429c28b741e] Destroying instance {{(pid=68906) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2398.172029] env[68906]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-3f6b9803-9c7e-4048-8eb1-b9d1f372d8ca {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2398.181377] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f9823419-125f-4f6c-a0f9-85c2b517c6d9 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2398.209914] env[68906]: WARNING nova.virt.vmwareapi.vmops [None req-39b2953e-c229-4fbf-a043-48005b810281 tempest-ServerActionsTestOtherA-1507860010 tempest-ServerActionsTestOtherA-1507860010-project-member] [instance: 01b79dfa-cd20-495d-b112-8429c28b741e] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 01b79dfa-cd20-495d-b112-8429c28b741e could not be found. [ 2398.209914] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-39b2953e-c229-4fbf-a043-48005b810281 tempest-ServerActionsTestOtherA-1507860010 tempest-ServerActionsTestOtherA-1507860010-project-member] [instance: 01b79dfa-cd20-495d-b112-8429c28b741e] Instance destroyed {{(pid=68906) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2398.210074] env[68906]: INFO nova.compute.manager [None req-39b2953e-c229-4fbf-a043-48005b810281 tempest-ServerActionsTestOtherA-1507860010 tempest-ServerActionsTestOtherA-1507860010-project-member] [instance: 01b79dfa-cd20-495d-b112-8429c28b741e] Took 0.04 seconds to destroy the instance on the hypervisor. [ 2398.210304] env[68906]: DEBUG oslo.service.loopingcall [None req-39b2953e-c229-4fbf-a043-48005b810281 tempest-ServerActionsTestOtherA-1507860010 tempest-ServerActionsTestOtherA-1507860010-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=68906) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2398.210509] env[68906]: DEBUG nova.compute.manager [-] [instance: 01b79dfa-cd20-495d-b112-8429c28b741e] Deallocating network for instance {{(pid=68906) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 2398.211216] env[68906]: DEBUG nova.network.neutron [-] [instance: 01b79dfa-cd20-495d-b112-8429c28b741e] deallocate_for_instance() {{(pid=68906) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2398.249231] env[68906]: DEBUG nova.network.neutron [-] [instance: 01b79dfa-cd20-495d-b112-8429c28b741e] Updating instance_info_cache with network_info: [] {{(pid=68906) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2398.257608] env[68906]: INFO nova.compute.manager [-] [instance: 01b79dfa-cd20-495d-b112-8429c28b741e] Took 0.05 seconds to deallocate network for instance.
[ 2398.348034] env[68906]: DEBUG oslo_concurrency.lockutils [None req-39b2953e-c229-4fbf-a043-48005b810281 tempest-ServerActionsTestOtherA-1507860010 tempest-ServerActionsTestOtherA-1507860010-project-member] Lock "01b79dfa-cd20-495d-b112-8429c28b741e" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.181s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2399.140546] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2399.140785] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Starting heal instance info cache {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 2399.140971] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Rebuilding the list of instances to heal {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 2399.156983] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: 8bfc91d4-b1d7-449a-8d48-0e63490fe663] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2399.157149] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: d70b039d-c8ad-4ffd-84f8-08f17cb97578] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2399.157278] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: ed276c3c-6085-427d-b3b7-86bbb8660dbc] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2399.157411] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: cd208e67-55a3-4c0b-ad49-abd3a700d5ef] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2399.157532] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: a4f1c6a3-c189-4e3a-8ac9-ac6ec8b95723] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2399.157658] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: 0d12bd8f-0e92-4066-9ada-6eff7b4c5dbe] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2399.157786] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Didn't find any instances for network info cache update.
{{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 2400.141069] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2401.141365] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._cleanup_incomplete_migrations {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2401.141679] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Cleaning up deleted instances with incomplete migration {{(pid=68906) _cleanup_incomplete_migrations /opt/stack/nova/nova/compute/manager.py:11236}} [ 2402.149713] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2403.141535] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2404.410014] env[68906]: DEBUG oslo_concurrency.lockutils [None req-05761f84-c3ce-4acf-ad3c-eb082e513226 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Acquiring lock "0d12bd8f-0e92-4066-9ada-6eff7b4c5dbe" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2407.141221] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2407.141525] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] CONF.reclaim_instance_interval <= 0, skipping...
{{(pid=68906) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 2410.136602] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2410.154438] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._cleanup_running_deleted_instances {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2410.154938] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Getting list of instances from cluster (obj){ [ 2410.154938] env[68906]: value = "domain-c8" [ 2410.154938] env[68906]: _type = "ClusterComputeResource" [ 2410.154938] env[68906]: } {{(pid=68906) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2122}} [ 2410.155970] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5f198f11-6265-43b8-b58d-c0a8fcadb9b1 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2410.171752] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Got total of 6 instances {{(pid=68906) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2131}} [ 2412.186150] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2413.140998] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2414.148985] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager.update_available_resource {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2414.160052] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2414.162068] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2414.162068] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2414.162068] env[68906]: 
DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=68906) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 2414.162068] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-20ddc226-2cd6-431a-ab3d-fc5db99c7a86 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2414.171654] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-878b35d0-1d16-49fe-8926-94cbb79818a5 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2414.186334] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c36ae730-04a1-4fbb-8ab3-77157acdb6ed {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2414.192614] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fd2fa30f-7e61-476d-a51b-84490df15eb5 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2414.222173] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180937MB free_disk=93GB free_vcpus=48 pci_devices=None {{(pid=68906) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 2414.222505] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2414.222846] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2414.336674] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 8bfc91d4-b1d7-449a-8d48-0e63490fe663 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2414.337096] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance d70b039d-c8ad-4ffd-84f8-08f17cb97578 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2414.338103] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance ed276c3c-6085-427d-b3b7-86bbb8660dbc actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2414.338103] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance cd208e67-55a3-4c0b-ad49-abd3a700d5ef actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2414.338103] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance a4f1c6a3-c189-4e3a-8ac9-ac6ec8b95723 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2414.338103] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 0d12bd8f-0e92-4066-9ada-6eff7b4c5dbe actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2414.338480] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Total usable vcpus: 48, total allocated vcpus: 6 {{(pid=68906) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 2414.338480] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1280MB phys_disk=200GB used_disk=6GB total_vcpus=48 used_vcpus=6 pci_stats=[] {{(pid=68906) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 2414.355048] env[68906]: DEBUG nova.scheduler.client.report [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Refreshing inventories for resource provider 1119f6db-bfd7-4ef3-bdff-5c6974dc249b {{(pid=68906) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:804}} [ 2414.368556] env[68906]: DEBUG nova.scheduler.client.report [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Updating ProviderTree inventory for provider 1119f6db-bfd7-4ef3-bdff-5c6974dc249b from _refresh_and_get_inventory using data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 93, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68906) _refresh_and_get_inventory /opt/stack/nova/nova/scheduler/client/report.py:768}} [ 2414.368556] env[68906]: DEBUG nova.compute.provider_tree [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Updating inventory in ProviderTree for provider 
1119f6db-bfd7-4ef3-bdff-5c6974dc249b with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 93, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68906) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 2414.377134] env[68906]: DEBUG nova.scheduler.client.report [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Refreshing aggregate associations for resource provider 1119f6db-bfd7-4ef3-bdff-5c6974dc249b, aggregates: None {{(pid=68906) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:813}} [ 2414.395563] env[68906]: DEBUG nova.scheduler.client.report [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Refreshing trait associations for resource provider 1119f6db-bfd7-4ef3-bdff-5c6974dc249b, traits: COMPUTE_SAME_HOST_COLD_MIGRATE,COMPUTE_IMAGE_TYPE_VMDK,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NODE,COMPUTE_IMAGE_TYPE_ISO {{(pid=68906) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:825}} [ 2414.475701] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3b5ee287-a8cc-4e6e-ac38-ce82c713ea74 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2414.482775] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9d13c672-9317-4561-ab0a-ebd0bf3b50f6 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2414.512711] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-40f76916-c7e6-4671-ab25-0af8e9df40e5 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2414.519057] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1f5eb81b-1843-4c18-98dd-6079cd97d341 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2414.532879] env[68906]: DEBUG nova.compute.provider_tree [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Inventory has not changed in ProviderTree for provider: 1119f6db-bfd7-4ef3-bdff-5c6974dc249b {{(pid=68906) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2414.541820] env[68906]: DEBUG nova.scheduler.client.report [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Inventory has not changed for provider 1119f6db-bfd7-4ef3-bdff-5c6974dc249b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 93, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68906) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2414.586094] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=68906) _update_available_resource 
/opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 2414.586246] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.363s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2415.140163] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._run_pending_deletes {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2415.140347] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Cleaning up deleted instances {{(pid=68906) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11198}} [ 2415.149922] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] There are 0 instances to clean {{(pid=68906) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11207}} [ 2445.690194] env[68906]: WARNING oslo_vmware.rw_handles [None req-6ecfa4f2-113a-4e91-9370-b3cb03d6f639 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 2445.690194] env[68906]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 2445.690194] env[68906]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 2445.690194] env[68906]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 2445.690194] env[68906]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 2445.690194] env[68906]: ERROR oslo_vmware.rw_handles response.begin() [ 2445.690194] env[68906]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 2445.690194] env[68906]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 2445.690194] env[68906]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 2445.690194] env[68906]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 2445.690194] env[68906]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 2445.690194] env[68906]: ERROR oslo_vmware.rw_handles [ 2445.696166] env[68906]: DEBUG nova.virt.vmwareapi.images [None req-6ecfa4f2-113a-4e91-9370-b3cb03d6f639 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: 8bfc91d4-b1d7-449a-8d48-0e63490fe663] Downloaded image file data b1400c31-d33b-4e13-944f-4c645e62493e to vmware_temp/6751d51c-5fa0-41d5-a5d3-d794bad71a2c/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk on the data store datastore2 {{(pid=68906) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 2445.696166] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-6ecfa4f2-113a-4e91-9370-b3cb03d6f639 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: 8bfc91d4-b1d7-449a-8d48-0e63490fe663] Caching image {{(pid=68906) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 2445.696166] env[68906]: DEBUG 
nova.virt.vmwareapi.vm_util [None req-6ecfa4f2-113a-4e91-9370-b3cb03d6f639 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Copying Virtual Disk [datastore2] vmware_temp/6751d51c-5fa0-41d5-a5d3-d794bad71a2c/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk to [datastore2] vmware_temp/6751d51c-5fa0-41d5-a5d3-d794bad71a2c/b1400c31-d33b-4e13-944f-4c645e62493e/b1400c31-d33b-4e13-944f-4c645e62493e.vmdk {{(pid=68906) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 2445.696166] env[68906]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-d53a3417-0590-493e-9fb4-b2453780817d {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2445.706115] env[68906]: DEBUG oslo_vmware.api [None req-6ecfa4f2-113a-4e91-9370-b3cb03d6f639 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Waiting for the task: (returnval){ [ 2445.706115] env[68906]: value = "task-3475486" [ 2445.706115] env[68906]: _type = "Task" [ 2445.706115] env[68906]: } to complete. {{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2445.720919] env[68906]: DEBUG oslo_vmware.api [None req-6ecfa4f2-113a-4e91-9370-b3cb03d6f639 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Task: {'id': task-3475486, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2446.121597] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._sync_power_states {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2446.138692] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Getting list of instances from cluster (obj){ [ 2446.138692] env[68906]: value = "domain-c8" [ 2446.138692] env[68906]: _type = "ClusterComputeResource" [ 2446.138692] env[68906]: } {{(pid=68906) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2122}} [ 2446.140087] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-353ab67b-de99-453e-a3e4-d4e2e2301704 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2446.154197] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Got total of 6 instances {{(pid=68906) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2131}} [ 2446.154397] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Triggering sync for uuid 8bfc91d4-b1d7-449a-8d48-0e63490fe663 {{(pid=68906) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 2446.154554] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Triggering sync for uuid d70b039d-c8ad-4ffd-84f8-08f17cb97578 {{(pid=68906) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 2446.154711] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Triggering sync for uuid ed276c3c-6085-427d-b3b7-86bbb8660dbc {{(pid=68906) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 
2446.154901] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Triggering sync for uuid cd208e67-55a3-4c0b-ad49-abd3a700d5ef {{(pid=68906) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 2446.155075] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Triggering sync for uuid a4f1c6a3-c189-4e3a-8ac9-ac6ec8b95723 {{(pid=68906) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 2446.155235] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Triggering sync for uuid 0d12bd8f-0e92-4066-9ada-6eff7b4c5dbe {{(pid=68906) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 2446.155537] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Acquiring lock "8bfc91d4-b1d7-449a-8d48-0e63490fe663" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2446.155763] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Acquiring lock "d70b039d-c8ad-4ffd-84f8-08f17cb97578" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2446.155962] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Acquiring lock "ed276c3c-6085-427d-b3b7-86bbb8660dbc" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2446.156199] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Acquiring lock "cd208e67-55a3-4c0b-ad49-abd3a700d5ef" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2446.156417] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Acquiring lock "a4f1c6a3-c189-4e3a-8ac9-ac6ec8b95723" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2446.156611] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Acquiring lock "0d12bd8f-0e92-4066-9ada-6eff7b4c5dbe" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2446.215305] env[68906]: DEBUG oslo_vmware.exceptions [None req-6ecfa4f2-113a-4e91-9370-b3cb03d6f639 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Fault InvalidArgument not matched. 
{{(pid=68906) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 2446.215562] env[68906]: DEBUG oslo_concurrency.lockutils [None req-6ecfa4f2-113a-4e91-9370-b3cb03d6f639 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Releasing lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e/b1400c31-d33b-4e13-944f-4c645e62493e.vmdk" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2446.216109] env[68906]: ERROR nova.compute.manager [None req-6ecfa4f2-113a-4e91-9370-b3cb03d6f639 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: 8bfc91d4-b1d7-449a-8d48-0e63490fe663] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2446.216109] env[68906]: Faults: ['InvalidArgument'] [ 2446.216109] env[68906]: ERROR nova.compute.manager [instance: 8bfc91d4-b1d7-449a-8d48-0e63490fe663] Traceback (most recent call last): [ 2446.216109] env[68906]: ERROR nova.compute.manager [instance: 8bfc91d4-b1d7-449a-8d48-0e63490fe663] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 2446.216109] env[68906]: ERROR nova.compute.manager [instance: 8bfc91d4-b1d7-449a-8d48-0e63490fe663] yield resources [ 2446.216109] env[68906]: ERROR nova.compute.manager [instance: 8bfc91d4-b1d7-449a-8d48-0e63490fe663] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 2446.216109] env[68906]: ERROR nova.compute.manager [instance: 8bfc91d4-b1d7-449a-8d48-0e63490fe663] self.driver.spawn(context, instance, image_meta, [ 2446.216109] env[68906]: ERROR nova.compute.manager [instance: 8bfc91d4-b1d7-449a-8d48-0e63490fe663] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2446.216109] env[68906]: ERROR nova.compute.manager [instance: 8bfc91d4-b1d7-449a-8d48-0e63490fe663] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2446.216109] env[68906]: ERROR nova.compute.manager [instance: 8bfc91d4-b1d7-449a-8d48-0e63490fe663] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2446.216109] env[68906]: ERROR nova.compute.manager [instance: 8bfc91d4-b1d7-449a-8d48-0e63490fe663] self._fetch_image_if_missing(context, vi) [ 2446.216109] env[68906]: ERROR nova.compute.manager [instance: 8bfc91d4-b1d7-449a-8d48-0e63490fe663] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2446.216453] env[68906]: ERROR nova.compute.manager [instance: 8bfc91d4-b1d7-449a-8d48-0e63490fe663] image_cache(vi, tmp_image_ds_loc) [ 2446.216453] env[68906]: ERROR nova.compute.manager [instance: 8bfc91d4-b1d7-449a-8d48-0e63490fe663] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2446.216453] env[68906]: ERROR nova.compute.manager [instance: 8bfc91d4-b1d7-449a-8d48-0e63490fe663] vm_util.copy_virtual_disk( [ 2446.216453] env[68906]: ERROR nova.compute.manager [instance: 8bfc91d4-b1d7-449a-8d48-0e63490fe663] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2446.216453] env[68906]: ERROR nova.compute.manager [instance: 8bfc91d4-b1d7-449a-8d48-0e63490fe663] session._wait_for_task(vmdk_copy_task) [ 2446.216453] env[68906]: ERROR nova.compute.manager [instance: 8bfc91d4-b1d7-449a-8d48-0e63490fe663] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in 
_wait_for_task [ 2446.216453] env[68906]: ERROR nova.compute.manager [instance: 8bfc91d4-b1d7-449a-8d48-0e63490fe663] return self.wait_for_task(task_ref) [ 2446.216453] env[68906]: ERROR nova.compute.manager [instance: 8bfc91d4-b1d7-449a-8d48-0e63490fe663] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2446.216453] env[68906]: ERROR nova.compute.manager [instance: 8bfc91d4-b1d7-449a-8d48-0e63490fe663] return evt.wait() [ 2446.216453] env[68906]: ERROR nova.compute.manager [instance: 8bfc91d4-b1d7-449a-8d48-0e63490fe663] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2446.216453] env[68906]: ERROR nova.compute.manager [instance: 8bfc91d4-b1d7-449a-8d48-0e63490fe663] result = hub.switch() [ 2446.216453] env[68906]: ERROR nova.compute.manager [instance: 8bfc91d4-b1d7-449a-8d48-0e63490fe663] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2446.216453] env[68906]: ERROR nova.compute.manager [instance: 8bfc91d4-b1d7-449a-8d48-0e63490fe663] return self.greenlet.switch() [ 2446.216828] env[68906]: ERROR nova.compute.manager [instance: 8bfc91d4-b1d7-449a-8d48-0e63490fe663] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2446.216828] env[68906]: ERROR nova.compute.manager [instance: 8bfc91d4-b1d7-449a-8d48-0e63490fe663] self.f(*self.args, **self.kw) [ 2446.216828] env[68906]: ERROR nova.compute.manager [instance: 8bfc91d4-b1d7-449a-8d48-0e63490fe663] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2446.216828] env[68906]: ERROR nova.compute.manager [instance: 8bfc91d4-b1d7-449a-8d48-0e63490fe663] raise exceptions.translate_fault(task_info.error) [ 2446.216828] env[68906]: ERROR nova.compute.manager [instance: 8bfc91d4-b1d7-449a-8d48-0e63490fe663] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2446.216828] env[68906]: ERROR nova.compute.manager [instance: 8bfc91d4-b1d7-449a-8d48-0e63490fe663] Faults: ['InvalidArgument'] [ 2446.216828] env[68906]: ERROR nova.compute.manager [instance: 8bfc91d4-b1d7-449a-8d48-0e63490fe663] [ 2446.216828] env[68906]: INFO nova.compute.manager [None req-6ecfa4f2-113a-4e91-9370-b3cb03d6f639 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: 8bfc91d4-b1d7-449a-8d48-0e63490fe663] Terminating instance [ 2446.217887] env[68906]: DEBUG oslo_concurrency.lockutils [None req-4077b494-948d-40fb-ba1f-1a5eae01fbe0 tempest-AttachVolumeTestJSON-1667500444 tempest-AttachVolumeTestJSON-1667500444-project-member] Acquired lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e/b1400c31-d33b-4e13-944f-4c645e62493e.vmdk" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2446.218130] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-4077b494-948d-40fb-ba1f-1a5eae01fbe0 tempest-AttachVolumeTestJSON-1667500444 tempest-AttachVolumeTestJSON-1667500444-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=68906) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2446.218369] env[68906]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-80ef9c7e-a27b-4ac4-a86a-97fbd8efa4e3 {{(pid=68906) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2446.220467] env[68906]: DEBUG nova.compute.manager [None req-6ecfa4f2-113a-4e91-9370-b3cb03d6f639 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: 8bfc91d4-b1d7-449a-8d48-0e63490fe663] Start destroying the instance on the hypervisor. {{(pid=68906) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2446.220656] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-6ecfa4f2-113a-4e91-9370-b3cb03d6f639 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: 8bfc91d4-b1d7-449a-8d48-0e63490fe663] Destroying instance {{(pid=68906) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2446.221350] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a5a5b2d6-c11b-433a-8ef7-2e35c0dafe45 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2446.228520] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-6ecfa4f2-113a-4e91-9370-b3cb03d6f639 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: 8bfc91d4-b1d7-449a-8d48-0e63490fe663] Unregistering the VM {{(pid=68906) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 2446.229368] env[68906]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-39ad2e08-132a-42f0-a99c-68e311442817 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2446.230653] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-4077b494-948d-40fb-ba1f-1a5eae01fbe0 tempest-AttachVolumeTestJSON-1667500444 tempest-AttachVolumeTestJSON-1667500444-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=68906) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2446.230826] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-4077b494-948d-40fb-ba1f-1a5eae01fbe0 tempest-AttachVolumeTestJSON-1667500444 tempest-AttachVolumeTestJSON-1667500444-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=68906) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 2446.231474] env[68906]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-23c46bb6-ab6e-4f48-ab85-594399c201e6 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2446.236129] env[68906]: DEBUG oslo_vmware.api [None req-4077b494-948d-40fb-ba1f-1a5eae01fbe0 tempest-AttachVolumeTestJSON-1667500444 tempest-AttachVolumeTestJSON-1667500444-project-member] Waiting for the task: (returnval){ [ 2446.236129] env[68906]: value = "session[52a3cb0d-1212-490c-2669-91043b4da4d8]52a4e5b8-911a-2854-d50b-6da0a9a66b5c" [ 2446.236129] env[68906]: _type = "Task" [ 2446.236129] env[68906]: } to complete. {{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2446.243206] env[68906]: DEBUG oslo_vmware.api [None req-4077b494-948d-40fb-ba1f-1a5eae01fbe0 tempest-AttachVolumeTestJSON-1667500444 tempest-AttachVolumeTestJSON-1667500444-project-member] Task: {'id': session[52a3cb0d-1212-490c-2669-91043b4da4d8]52a4e5b8-911a-2854-d50b-6da0a9a66b5c, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2446.300415] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-6ecfa4f2-113a-4e91-9370-b3cb03d6f639 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: 8bfc91d4-b1d7-449a-8d48-0e63490fe663] Unregistered the VM {{(pid=68906) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 2446.300692] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-6ecfa4f2-113a-4e91-9370-b3cb03d6f639 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: 8bfc91d4-b1d7-449a-8d48-0e63490fe663] Deleting contents of the VM from datastore datastore2 {{(pid=68906) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 2446.300845] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-6ecfa4f2-113a-4e91-9370-b3cb03d6f639 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Deleting the datastore file [datastore2] 8bfc91d4-b1d7-449a-8d48-0e63490fe663 {{(pid=68906) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 2446.301084] env[68906]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-aa9dd157-ff44-43d7-b23e-f17d243eb68e {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2446.307620] env[68906]: DEBUG oslo_vmware.api [None req-6ecfa4f2-113a-4e91-9370-b3cb03d6f639 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Waiting for the task: (returnval){ [ 2446.307620] env[68906]: value = "task-3475488" [ 2446.307620] env[68906]: _type = "Task" [ 2446.307620] env[68906]: } to complete. {{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2446.316363] env[68906]: DEBUG oslo_vmware.api [None req-6ecfa4f2-113a-4e91-9370-b3cb03d6f639 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Task: {'id': task-3475488, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2446.746751] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-4077b494-948d-40fb-ba1f-1a5eae01fbe0 tempest-AttachVolumeTestJSON-1667500444 tempest-AttachVolumeTestJSON-1667500444-project-member] [instance: d70b039d-c8ad-4ffd-84f8-08f17cb97578] Preparing fetch location {{(pid=68906) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 2446.747116] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-4077b494-948d-40fb-ba1f-1a5eae01fbe0 tempest-AttachVolumeTestJSON-1667500444 tempest-AttachVolumeTestJSON-1667500444-project-member] Creating directory with path [datastore2] vmware_temp/0f613bd9-f3b2-4f2b-a7c4-53d563e2b8ab/b1400c31-d33b-4e13-944f-4c645e62493e {{(pid=68906) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2446.747278] env[68906]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-0fcf8145-4a66-42d4-ae51-e9ed99abddf4 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2446.778471] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-4077b494-948d-40fb-ba1f-1a5eae01fbe0 tempest-AttachVolumeTestJSON-1667500444 tempest-AttachVolumeTestJSON-1667500444-project-member] Created directory with path [datastore2] vmware_temp/0f613bd9-f3b2-4f2b-a7c4-53d563e2b8ab/b1400c31-d33b-4e13-944f-4c645e62493e {{(pid=68906) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2446.778662] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-4077b494-948d-40fb-ba1f-1a5eae01fbe0 tempest-AttachVolumeTestJSON-1667500444 tempest-AttachVolumeTestJSON-1667500444-project-member] [instance: d70b039d-c8ad-4ffd-84f8-08f17cb97578] Fetch image to [datastore2] vmware_temp/0f613bd9-f3b2-4f2b-a7c4-53d563e2b8ab/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk {{(pid=68906) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 2446.778830] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-4077b494-948d-40fb-ba1f-1a5eae01fbe0 tempest-AttachVolumeTestJSON-1667500444 tempest-AttachVolumeTestJSON-1667500444-project-member] [instance: d70b039d-c8ad-4ffd-84f8-08f17cb97578] Downloading image file data b1400c31-d33b-4e13-944f-4c645e62493e to [datastore2] vmware_temp/0f613bd9-f3b2-4f2b-a7c4-53d563e2b8ab/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk on the data store datastore2 {{(pid=68906) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 2446.779615] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5ee1f012-bc3b-4d5d-a7fb-291f08e1504c {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2446.786292] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e49ebe73-c89e-46d8-bbb1-708349a69729 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2446.795414] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5e8147dc-1e27-409a-93c2-95b0b7149bcd {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2446.827382] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-234be7a1-a02c-43eb-81a8-2d28f348f5a3 {{(pid=68906) 
request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2446.833780] env[68906]: DEBUG oslo_vmware.api [None req-6ecfa4f2-113a-4e91-9370-b3cb03d6f639 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Task: {'id': task-3475488, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.076813} completed successfully. {{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2446.835631] env[68906]: DEBUG nova.virt.vmwareapi.ds_util [None req-6ecfa4f2-113a-4e91-9370-b3cb03d6f639 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Deleted the datastore file {{(pid=68906) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 2446.835896] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-6ecfa4f2-113a-4e91-9370-b3cb03d6f639 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: 8bfc91d4-b1d7-449a-8d48-0e63490fe663] Deleted contents of the VM from datastore datastore2 {{(pid=68906) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 2446.836104] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-6ecfa4f2-113a-4e91-9370-b3cb03d6f639 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: 8bfc91d4-b1d7-449a-8d48-0e63490fe663] Instance destroyed {{(pid=68906) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2446.836289] env[68906]: INFO nova.compute.manager [None req-6ecfa4f2-113a-4e91-9370-b3cb03d6f639 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: 8bfc91d4-b1d7-449a-8d48-0e63490fe663] Took 0.62 seconds to destroy the instance on the hypervisor. 
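
The "Waiting for the task ... to complete", "progress is 0%." and "completed successfully" entries above (task-3475486, the SearchDatastore_Task session task, and task-3475488) all come from oslo.vmware's task poller, which re-reads the vCenter Task's info on a fixed-interval looping call until it leaves the queued/running states. A minimal sketch of that loop, assuming a pyVmomi-style Task object whose info exposes key, state, progress, result and error (an illustration of the pattern, not the oslo.vmware source):

    import time

    def wait_for_task(task, poll_interval=0.5):
        # Poll the Task managed object until it succeeds or fails, printing
        # the same progress/completion lines that appear in the log above.
        while True:
            info = task.info                      # TaskInfo snapshot
            if info.state in ("queued", "running"):
                print(f"Task: {info.key} progress is {info.progress or 0}%.")
                time.sleep(poll_interval)
            elif info.state == "success":
                print(f"Task: {info.key} completed successfully.")
                return info.result
            else:  # "error": oslo.vmware translates the fault here instead
                raise RuntimeError(f"Task {info.key} failed: {info.error}")

When the returned fault does not map to a specific exception class, the library logs "Fault InvalidArgument not matched." and falls back to the generic VimFaultException, which is exactly what the spawn traceback above shows for the CopyVirtualDisk_Task.
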
[ 2446.838054] env[68906]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-cdd99da9-ebf6-4238-8ed9-6973e1851eea {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2446.840164] env[68906]: DEBUG nova.compute.claims [None req-6ecfa4f2-113a-4e91-9370-b3cb03d6f639 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: 8bfc91d4-b1d7-449a-8d48-0e63490fe663] Aborting claim: {{(pid=68906) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 2446.840339] env[68906]: DEBUG oslo_concurrency.lockutils [None req-6ecfa4f2-113a-4e91-9370-b3cb03d6f639 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2446.840546] env[68906]: DEBUG oslo_concurrency.lockutils [None req-6ecfa4f2-113a-4e91-9370-b3cb03d6f639 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2446.861984] env[68906]: DEBUG nova.virt.vmwareapi.images [None req-4077b494-948d-40fb-ba1f-1a5eae01fbe0 tempest-AttachVolumeTestJSON-1667500444 tempest-AttachVolumeTestJSON-1667500444-project-member] [instance: d70b039d-c8ad-4ffd-84f8-08f17cb97578] Downloading image file data b1400c31-d33b-4e13-944f-4c645e62493e to the data store datastore2 {{(pid=68906) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 2447.001438] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-adbe2651-8509-4f0a-8cae-36cff3608fc3 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2447.010762] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-47ee8b60-38f1-41b1-b5de-b0bc853de61e {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2447.048656] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-09ed936b-9382-495e-a7b8-3b81592ce0dc {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2447.056270] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4ab78aec-fba1-42f6-8965-6f02eed355e7 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2447.070396] env[68906]: DEBUG nova.compute.provider_tree [None req-6ecfa4f2-113a-4e91-9370-b3cb03d6f639 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Inventory has not changed in ProviderTree for provider: 1119f6db-bfd7-4ef3-bdff-5c6974dc249b {{(pid=68906) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2447.077819] env[68906]: DEBUG oslo_vmware.rw_handles [None req-4077b494-948d-40fb-ba1f-1a5eae01fbe0 tempest-AttachVolumeTestJSON-1667500444 tempest-AttachVolumeTestJSON-1667500444-project-member] Creating HTTP connection to write to file with 
size = 21318656 and URL = https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/0f613bd9-f3b2-4f2b-a7c4-53d563e2b8ab/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=68906) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 2447.079364] env[68906]: DEBUG nova.scheduler.client.report [None req-6ecfa4f2-113a-4e91-9370-b3cb03d6f639 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Inventory has not changed for provider 1119f6db-bfd7-4ef3-bdff-5c6974dc249b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 93, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68906) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2447.140735] env[68906]: DEBUG oslo_concurrency.lockutils [None req-6ecfa4f2-113a-4e91-9370-b3cb03d6f639 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.300s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2447.141307] env[68906]: ERROR nova.compute.manager [None req-6ecfa4f2-113a-4e91-9370-b3cb03d6f639 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: 8bfc91d4-b1d7-449a-8d48-0e63490fe663] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2447.141307] env[68906]: Faults: ['InvalidArgument'] [ 2447.141307] env[68906]: ERROR nova.compute.manager [instance: 8bfc91d4-b1d7-449a-8d48-0e63490fe663] Traceback (most recent call last): [ 2447.141307] env[68906]: ERROR nova.compute.manager [instance: 8bfc91d4-b1d7-449a-8d48-0e63490fe663] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 2447.141307] env[68906]: ERROR nova.compute.manager [instance: 8bfc91d4-b1d7-449a-8d48-0e63490fe663] self.driver.spawn(context, instance, image_meta, [ 2447.141307] env[68906]: ERROR nova.compute.manager [instance: 8bfc91d4-b1d7-449a-8d48-0e63490fe663] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2447.141307] env[68906]: ERROR nova.compute.manager [instance: 8bfc91d4-b1d7-449a-8d48-0e63490fe663] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2447.141307] env[68906]: ERROR nova.compute.manager [instance: 8bfc91d4-b1d7-449a-8d48-0e63490fe663] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2447.141307] env[68906]: ERROR nova.compute.manager [instance: 8bfc91d4-b1d7-449a-8d48-0e63490fe663] self._fetch_image_if_missing(context, vi) [ 2447.141307] env[68906]: ERROR nova.compute.manager [instance: 8bfc91d4-b1d7-449a-8d48-0e63490fe663] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2447.141307] env[68906]: ERROR nova.compute.manager [instance: 8bfc91d4-b1d7-449a-8d48-0e63490fe663] image_cache(vi, tmp_image_ds_loc) [ 2447.141307] env[68906]: ERROR nova.compute.manager [instance: 8bfc91d4-b1d7-449a-8d48-0e63490fe663] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", 
line 537, in _cache_sparse_image [ 2447.141650] env[68906]: ERROR nova.compute.manager [instance: 8bfc91d4-b1d7-449a-8d48-0e63490fe663] vm_util.copy_virtual_disk( [ 2447.141650] env[68906]: ERROR nova.compute.manager [instance: 8bfc91d4-b1d7-449a-8d48-0e63490fe663] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2447.141650] env[68906]: ERROR nova.compute.manager [instance: 8bfc91d4-b1d7-449a-8d48-0e63490fe663] session._wait_for_task(vmdk_copy_task) [ 2447.141650] env[68906]: ERROR nova.compute.manager [instance: 8bfc91d4-b1d7-449a-8d48-0e63490fe663] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2447.141650] env[68906]: ERROR nova.compute.manager [instance: 8bfc91d4-b1d7-449a-8d48-0e63490fe663] return self.wait_for_task(task_ref) [ 2447.141650] env[68906]: ERROR nova.compute.manager [instance: 8bfc91d4-b1d7-449a-8d48-0e63490fe663] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2447.141650] env[68906]: ERROR nova.compute.manager [instance: 8bfc91d4-b1d7-449a-8d48-0e63490fe663] return evt.wait() [ 2447.141650] env[68906]: ERROR nova.compute.manager [instance: 8bfc91d4-b1d7-449a-8d48-0e63490fe663] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2447.141650] env[68906]: ERROR nova.compute.manager [instance: 8bfc91d4-b1d7-449a-8d48-0e63490fe663] result = hub.switch() [ 2447.141650] env[68906]: ERROR nova.compute.manager [instance: 8bfc91d4-b1d7-449a-8d48-0e63490fe663] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2447.141650] env[68906]: ERROR nova.compute.manager [instance: 8bfc91d4-b1d7-449a-8d48-0e63490fe663] return self.greenlet.switch() [ 2447.141650] env[68906]: ERROR nova.compute.manager [instance: 8bfc91d4-b1d7-449a-8d48-0e63490fe663] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2447.141650] env[68906]: ERROR nova.compute.manager [instance: 8bfc91d4-b1d7-449a-8d48-0e63490fe663] self.f(*self.args, **self.kw) [ 2447.142014] env[68906]: ERROR nova.compute.manager [instance: 8bfc91d4-b1d7-449a-8d48-0e63490fe663] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2447.142014] env[68906]: ERROR nova.compute.manager [instance: 8bfc91d4-b1d7-449a-8d48-0e63490fe663] raise exceptions.translate_fault(task_info.error) [ 2447.142014] env[68906]: ERROR nova.compute.manager [instance: 8bfc91d4-b1d7-449a-8d48-0e63490fe663] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2447.142014] env[68906]: ERROR nova.compute.manager [instance: 8bfc91d4-b1d7-449a-8d48-0e63490fe663] Faults: ['InvalidArgument'] [ 2447.142014] env[68906]: ERROR nova.compute.manager [instance: 8bfc91d4-b1d7-449a-8d48-0e63490fe663] [ 2447.142014] env[68906]: DEBUG nova.compute.utils [None req-6ecfa4f2-113a-4e91-9370-b3cb03d6f639 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: 8bfc91d4-b1d7-449a-8d48-0e63490fe663] VimFaultException {{(pid=68906) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 2447.143519] env[68906]: DEBUG nova.compute.manager [None req-6ecfa4f2-113a-4e91-9370-b3cb03d6f639 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: 8bfc91d4-b1d7-449a-8d48-0e63490fe663] Build of instance 
8bfc91d4-b1d7-449a-8d48-0e63490fe663 was re-scheduled: A specified parameter was not correct: fileType [ 2447.143519] env[68906]: Faults: ['InvalidArgument'] {{(pid=68906) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 2447.143882] env[68906]: DEBUG nova.compute.manager [None req-6ecfa4f2-113a-4e91-9370-b3cb03d6f639 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: 8bfc91d4-b1d7-449a-8d48-0e63490fe663] Unplugging VIFs for instance {{(pid=68906) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 2447.144070] env[68906]: DEBUG nova.compute.manager [None req-6ecfa4f2-113a-4e91-9370-b3cb03d6f639 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=68906) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 2447.144245] env[68906]: DEBUG nova.compute.manager [None req-6ecfa4f2-113a-4e91-9370-b3cb03d6f639 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: 8bfc91d4-b1d7-449a-8d48-0e63490fe663] Deallocating network for instance {{(pid=68906) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 2447.144411] env[68906]: DEBUG nova.network.neutron [None req-6ecfa4f2-113a-4e91-9370-b3cb03d6f639 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: 8bfc91d4-b1d7-449a-8d48-0e63490fe663] deallocate_for_instance() {{(pid=68906) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2447.148201] env[68906]: DEBUG oslo_vmware.rw_handles [None req-4077b494-948d-40fb-ba1f-1a5eae01fbe0 tempest-AttachVolumeTestJSON-1667500444 tempest-AttachVolumeTestJSON-1667500444-project-member] Completed reading data from the image iterator. {{(pid=68906) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 2447.148370] env[68906]: DEBUG oslo_vmware.rw_handles [None req-4077b494-948d-40fb-ba1f-1a5eae01fbe0 tempest-AttachVolumeTestJSON-1667500444 tempest-AttachVolumeTestJSON-1667500444-project-member] Closing write handle for https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/0f613bd9-f3b2-4f2b-a7c4-53d563e2b8ab/b1400c31-d33b-4e13-944f-4c645e62493e/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=68906) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 2447.455446] env[68906]: DEBUG nova.network.neutron [None req-6ecfa4f2-113a-4e91-9370-b3cb03d6f639 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: 8bfc91d4-b1d7-449a-8d48-0e63490fe663] Updating instance_info_cache with network_info: [] {{(pid=68906) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2447.473435] env[68906]: INFO nova.compute.manager [None req-6ecfa4f2-113a-4e91-9370-b3cb03d6f639 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: 8bfc91d4-b1d7-449a-8d48-0e63490fe663] Took 0.33 seconds to deallocate network for instance. 
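
The inventory dict reported over and over for provider 1119f6db-bfd7-4ef3-bdff-5c6974dc249b is what placement uses to bound scheduling: per resource class, schedulable capacity is (total - reserved) * allocation_ratio, and max_unit caps any single allocation. A worked check with the exact values from the log (the formula is placement's standard capacity calculation; the snippet itself is only illustrative):

    # Inventory as logged for provider 1119f6db-bfd7-4ef3-bdff-5c6974dc249b.
    inventory = {
        "VCPU": {"total": 48, "reserved": 0, "max_unit": 16,
                 "allocation_ratio": 4.0},
        "MEMORY_MB": {"total": 196590, "reserved": 512, "max_unit": 65530,
                      "allocation_ratio": 1.0},
        "DISK_GB": {"total": 400, "reserved": 0, "max_unit": 93,
                    "allocation_ratio": 1.0},
    }

    for rc, inv in inventory.items():
        capacity = (inv["total"] - inv["reserved"]) * inv["allocation_ratio"]
        print(f"{rc}: capacity={capacity:g}, single-allocation cap={inv['max_unit']}")
    # VCPU: 192 (48 cores oversubscribed 4x), MEMORY_MB: 196078, DISK_GB: 400,
    # with max_unit=93 matching the hypervisor's reported free_disk=93GB.

Against that capacity, the six instances each hold {DISK_GB: 1, MEMORY_MB: 128, VCPU: 1}, i.e. 6 vCPUs, 768 MB and 6 GB in total; the audit's used_ram=1280MB is those 768 MB plus the 512 MB reserved, which is why every pass concludes the inventory has not changed.
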
[ 2447.574051] env[68906]: INFO nova.scheduler.client.report [None req-6ecfa4f2-113a-4e91-9370-b3cb03d6f639 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Deleted allocations for instance 8bfc91d4-b1d7-449a-8d48-0e63490fe663 [ 2447.603641] env[68906]: DEBUG oslo_concurrency.lockutils [None req-6ecfa4f2-113a-4e91-9370-b3cb03d6f639 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Lock "8bfc91d4-b1d7-449a-8d48-0e63490fe663" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 632.677s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2447.603996] env[68906]: DEBUG oslo_concurrency.lockutils [None req-e3d0afc1-6979-43e2-885d-82661245d8f2 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Lock "8bfc91d4-b1d7-449a-8d48-0e63490fe663" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 436.270s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2447.604148] env[68906]: DEBUG oslo_concurrency.lockutils [None req-e3d0afc1-6979-43e2-885d-82661245d8f2 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Acquiring lock "8bfc91d4-b1d7-449a-8d48-0e63490fe663-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2447.604377] env[68906]: DEBUG oslo_concurrency.lockutils [None req-e3d0afc1-6979-43e2-885d-82661245d8f2 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Lock "8bfc91d4-b1d7-449a-8d48-0e63490fe663-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2447.604578] env[68906]: DEBUG oslo_concurrency.lockutils [None req-e3d0afc1-6979-43e2-885d-82661245d8f2 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Lock "8bfc91d4-b1d7-449a-8d48-0e63490fe663-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2447.606629] env[68906]: INFO nova.compute.manager [None req-e3d0afc1-6979-43e2-885d-82661245d8f2 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: 8bfc91d4-b1d7-449a-8d48-0e63490fe663] Terminating instance [ 2447.608257] env[68906]: DEBUG nova.compute.manager [None req-e3d0afc1-6979-43e2-885d-82661245d8f2 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: 8bfc91d4-b1d7-449a-8d48-0e63490fe663] Start destroying the instance on the hypervisor. 
{{(pid=68906) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2447.608437] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-e3d0afc1-6979-43e2-885d-82661245d8f2 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: 8bfc91d4-b1d7-449a-8d48-0e63490fe663] Destroying instance {{(pid=68906) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2447.608884] env[68906]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-1d64a11f-0924-45ac-bebd-0fb5e35fd88b {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2447.618445] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bca05623-3dd9-41de-8f2a-e484eb83f685 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2447.646631] env[68906]: WARNING nova.virt.vmwareapi.vmops [None req-e3d0afc1-6979-43e2-885d-82661245d8f2 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: 8bfc91d4-b1d7-449a-8d48-0e63490fe663] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 8bfc91d4-b1d7-449a-8d48-0e63490fe663 could not be found. [ 2447.647030] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-e3d0afc1-6979-43e2-885d-82661245d8f2 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: 8bfc91d4-b1d7-449a-8d48-0e63490fe663] Instance destroyed {{(pid=68906) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2447.647030] env[68906]: INFO nova.compute.manager [None req-e3d0afc1-6979-43e2-885d-82661245d8f2 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] [instance: 8bfc91d4-b1d7-449a-8d48-0e63490fe663] Took 0.04 seconds to destroy the instance on the hypervisor. [ 2447.648023] env[68906]: DEBUG oslo.service.loopingcall [None req-e3d0afc1-6979-43e2-885d-82661245d8f2 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=68906) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2447.648023] env[68906]: DEBUG nova.compute.manager [-] [instance: 8bfc91d4-b1d7-449a-8d48-0e63490fe663] Deallocating network for instance {{(pid=68906) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 2447.648023] env[68906]: DEBUG nova.network.neutron [-] [instance: 8bfc91d4-b1d7-449a-8d48-0e63490fe663] deallocate_for_instance() {{(pid=68906) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2447.681836] env[68906]: DEBUG nova.network.neutron [-] [instance: 8bfc91d4-b1d7-449a-8d48-0e63490fe663] Updating instance_info_cache with network_info: [] {{(pid=68906) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2447.693093] env[68906]: INFO nova.compute.manager [-] [instance: 8bfc91d4-b1d7-449a-8d48-0e63490fe663] Took 0.05 seconds to deallocate network for instance. 
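
Almost every operation in this trace is bracketed by the same three lockutils messages: Acquiring lock ... by ..., acquired ... waited Ns, and "released" ... held Ns. oslo.concurrency emits them from a decorator that serializes work per lock name (here, per instance UUID). A simplified stand-in that reproduces the pattern with plain threading (the real lockutils also supports fair locks and external file-based locks; this is only a sketch of the logging and serialization):

    import functools
    import threading
    import time

    _locks = {}  # one shared lock per name, like lockutils' semaphore registry

    def synchronized(name):
        lock = _locks.setdefault(name, threading.Lock())
        def wrap(f):
            @functools.wraps(f)
            def inner(*args, **kwargs):
                print(f'Acquiring lock "{name}" by "{f.__qualname__}"')
                t_wait = time.monotonic()
                with lock:
                    print(f'Lock "{name}" acquired :: '
                          f'waited {time.monotonic() - t_wait:.3f}s')
                    t_held = time.monotonic()
                    try:
                        return f(*args, **kwargs)
                    finally:
                        print(f'Lock "{name}" "released" :: '
                              f'held {time.monotonic() - t_held:.3f}s')
            return inner
        return wrap

That serialization explains the timings in this trace: the failed build held lock 8bfc91d4-b1d7-449a-8d48-0e63490fe663 for 632.677 seconds, so the terminate request had to wait 436.270 seconds on the same UUID above, and the power-state sync just below waits a further 1.644 seconds for the terminate to release it.
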
[ 2447.798010] env[68906]: DEBUG oslo_concurrency.lockutils [None req-e3d0afc1-6979-43e2-885d-82661245d8f2 tempest-ServersTestJSON-1226730598 tempest-ServersTestJSON-1226730598-project-member] Lock "8bfc91d4-b1d7-449a-8d48-0e63490fe663" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.194s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 2447.799424] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Lock "8bfc91d4-b1d7-449a-8d48-0e63490fe663" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 1.644s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 2447.799745] env[68906]: INFO nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: 8bfc91d4-b1d7-449a-8d48-0e63490fe663] During sync_power_state the instance has a pending task (deleting). Skip.
[ 2447.800009] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Lock "8bfc91d4-b1d7-449a-8d48-0e63490fe663" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.001s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 2454.177641] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 2459.141613] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 2459.142052] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Starting heal instance info cache {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}}
[ 2459.142052] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Rebuilding the list of instances to heal {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}}
[ 2459.160407] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: d70b039d-c8ad-4ffd-84f8-08f17cb97578] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 2459.160604] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: ed276c3c-6085-427d-b3b7-86bbb8660dbc] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 2459.160703] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: cd208e67-55a3-4c0b-ad49-abd3a700d5ef] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 2459.160834] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: a4f1c6a3-c189-4e3a-8ac9-ac6ec8b95723] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 2459.160958] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] [instance: 0d12bd8f-0e92-4066-9ada-6eff7b4c5dbe] Skipping network cache update for instance because it is Building. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 2459.161097] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Didn't find any instances for network info cache update. {{(pid=68906) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}}
[ 2459.161642] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 2461.140990] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 2463.141686] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 2464.140465] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 2466.386142] env[68906]: DEBUG oslo_concurrency.lockutils [None req-427413b8-758c-4244-881d-f397aaaba2c1 tempest-ServerShowV257Test-663276056 tempest-ServerShowV257Test-663276056-project-member] Acquiring lock "456ee279-7956-4307-92a0-723cb528c531" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 2466.386142] env[68906]: DEBUG oslo_concurrency.lockutils [None req-427413b8-758c-4244-881d-f397aaaba2c1 tempest-ServerShowV257Test-663276056 tempest-ServerShowV257Test-663276056-project-member] Lock "456ee279-7956-4307-92a0-723cb528c531" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 2466.396113] env[68906]: DEBUG nova.compute.manager [None req-427413b8-758c-4244-881d-f397aaaba2c1 tempest-ServerShowV257Test-663276056 tempest-ServerShowV257Test-663276056-project-member] [instance: 456ee279-7956-4307-92a0-723cb528c531] Starting instance... {{(pid=68906) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}}
[ 2466.444584] env[68906]: DEBUG oslo_concurrency.lockutils [None req-427413b8-758c-4244-881d-f397aaaba2c1 tempest-ServerShowV257Test-663276056 tempest-ServerShowV257Test-663276056-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 2466.444835] env[68906]: DEBUG oslo_concurrency.lockutils [None req-427413b8-758c-4244-881d-f397aaaba2c1 tempest-ServerShowV257Test-663276056 tempest-ServerShowV257Test-663276056-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 2466.446475] env[68906]: INFO nova.compute.claims [None req-427413b8-758c-4244-881d-f397aaaba2c1 tempest-ServerShowV257Test-663276056 tempest-ServerShowV257Test-663276056-project-member] [instance: 456ee279-7956-4307-92a0-723cb528c531] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28
[ 2466.562220] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2086f256-a166-4df1-9866-f869a895dd11 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2466.570777] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bcbf02f0-6242-4eab-8109-744bfada4686 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2466.601786] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-62ce782c-27c7-48bd-9ce6-52385a1358f1 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2466.608939] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cf19b04f-b8d9-472d-816d-31ee10ef2608 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2466.622181] env[68906]: DEBUG nova.compute.provider_tree [None req-427413b8-758c-4244-881d-f397aaaba2c1 tempest-ServerShowV257Test-663276056 tempest-ServerShowV257Test-663276056-project-member] Inventory has not changed in ProviderTree for provider: 1119f6db-bfd7-4ef3-bdff-5c6974dc249b {{(pid=68906) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 2466.631627] env[68906]: DEBUG nova.scheduler.client.report [None req-427413b8-758c-4244-881d-f397aaaba2c1 tempest-ServerShowV257Test-663276056 tempest-ServerShowV257Test-663276056-project-member] Inventory has not changed for provider 1119f6db-bfd7-4ef3-bdff-5c6974dc249b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 93, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68906) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
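The inventory dict logged above is what the resource tracker reports to placement, and it bounds the claim that just succeeded. A quick sanity check, assuming the usual placement capacity rule (total - reserved) * allocation_ratio, with the numbers taken from the log line:

    # Capacities implied by the logged inventory for provider 1119f6db-...
    inventory = {
        'VCPU':      {'total': 48,     'reserved': 0,   'allocation_ratio': 4.0},
        'MEMORY_MB': {'total': 196590, 'reserved': 512, 'allocation_ratio': 1.0},
        'DISK_GB':   {'total': 400,    'reserved': 0,   'allocation_ratio': 1.0},
    }
    for rc, inv in inventory.items():
        capacity = (inv['total'] - inv['reserved']) * inv['allocation_ratio']
        print(rc, capacity)  # VCPU 192.0, MEMORY_MB 196078.0, DISK_GB 400.0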
[ 2466.644449] env[68906]: DEBUG oslo_concurrency.lockutils [None req-427413b8-758c-4244-881d-f397aaaba2c1 tempest-ServerShowV257Test-663276056 tempest-ServerShowV257Test-663276056-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.200s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 2466.644887] env[68906]: DEBUG nova.compute.manager [None req-427413b8-758c-4244-881d-f397aaaba2c1 tempest-ServerShowV257Test-663276056 tempest-ServerShowV257Test-663276056-project-member] [instance: 456ee279-7956-4307-92a0-723cb528c531] Start building networks asynchronously for instance. {{(pid=68906) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}}
[ 2466.676462] env[68906]: DEBUG nova.compute.utils [None req-427413b8-758c-4244-881d-f397aaaba2c1 tempest-ServerShowV257Test-663276056 tempest-ServerShowV257Test-663276056-project-member] Using /dev/sd instead of None {{(pid=68906) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}}
[ 2466.677739] env[68906]: DEBUG nova.compute.manager [None req-427413b8-758c-4244-881d-f397aaaba2c1 tempest-ServerShowV257Test-663276056 tempest-ServerShowV257Test-663276056-project-member] [instance: 456ee279-7956-4307-92a0-723cb528c531] Not allocating networking since 'none' was specified. {{(pid=68906) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}}
[ 2466.686955] env[68906]: DEBUG nova.compute.manager [None req-427413b8-758c-4244-881d-f397aaaba2c1 tempest-ServerShowV257Test-663276056 tempest-ServerShowV257Test-663276056-project-member] [instance: 456ee279-7956-4307-92a0-723cb528c531] Start building block device mappings for instance. {{(pid=68906) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}}
[ 2466.746612] env[68906]: DEBUG nova.compute.manager [None req-427413b8-758c-4244-881d-f397aaaba2c1 tempest-ServerShowV257Test-663276056 tempest-ServerShowV257Test-663276056-project-member] [instance: 456ee279-7956-4307-92a0-723cb528c531] Start spawning the instance on the hypervisor. {{(pid=68906) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}}
[ 2466.771107] env[68906]: DEBUG nova.virt.hardware [None req-427413b8-758c-4244-881d-f397aaaba2c1 tempest-ServerShowV257Test-663276056 tempest-ServerShowV257Test-663276056-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-17T13:00:39Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-17T13:00:23Z,direct_url=,disk_format='vmdk',id=b1400c31-d33b-4e13-944f-4c645e62493e,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='1ae7bf3a375d41c6af5e7536af51ffd1',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-17T13:00:24Z,virtual_size=,visibility=), allow threads: False {{(pid=68906) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}}
[ 2466.771360] env[68906]: DEBUG nova.virt.hardware [None req-427413b8-758c-4244-881d-f397aaaba2c1 tempest-ServerShowV257Test-663276056 tempest-ServerShowV257Test-663276056-project-member] Flavor limits 0:0:0 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}}
[ 2466.771515] env[68906]: DEBUG nova.virt.hardware [None req-427413b8-758c-4244-881d-f397aaaba2c1 tempest-ServerShowV257Test-663276056 tempest-ServerShowV257Test-663276056-project-member] Image limits 0:0:0 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}}
[ 2466.771692] env[68906]: DEBUG nova.virt.hardware [None req-427413b8-758c-4244-881d-f397aaaba2c1 tempest-ServerShowV257Test-663276056 tempest-ServerShowV257Test-663276056-project-member] Flavor pref 0:0:0 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}}
[ 2466.771836] env[68906]: DEBUG nova.virt.hardware [None req-427413b8-758c-4244-881d-f397aaaba2c1 tempest-ServerShowV257Test-663276056 tempest-ServerShowV257Test-663276056-project-member] Image pref 0:0:0 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}}
[ 2466.771982] env[68906]: DEBUG nova.virt.hardware [None req-427413b8-758c-4244-881d-f397aaaba2c1 tempest-ServerShowV257Test-663276056 tempest-ServerShowV257Test-663276056-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68906) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}}
[ 2466.772223] env[68906]: DEBUG nova.virt.hardware [None req-427413b8-758c-4244-881d-f397aaaba2c1 tempest-ServerShowV257Test-663276056 tempest-ServerShowV257Test-663276056-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68906) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}}
[ 2466.772381] env[68906]: DEBUG nova.virt.hardware [None req-427413b8-758c-4244-881d-f397aaaba2c1 tempest-ServerShowV257Test-663276056 tempest-ServerShowV257Test-663276056-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68906) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}}
[ 2466.772546] env[68906]: DEBUG nova.virt.hardware [None req-427413b8-758c-4244-881d-f397aaaba2c1 tempest-ServerShowV257Test-663276056 tempest-ServerShowV257Test-663276056-project-member] Got 1 possible topologies {{(pid=68906) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}}
[ 2466.772707] env[68906]: DEBUG nova.virt.hardware [None req-427413b8-758c-4244-881d-f397aaaba2c1 tempest-ServerShowV257Test-663276056 tempest-ServerShowV257Test-663276056-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68906) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}}
[ 2466.772877] env[68906]: DEBUG nova.virt.hardware [None req-427413b8-758c-4244-881d-f397aaaba2c1 tempest-ServerShowV257Test-663276056 tempest-ServerShowV257Test-663276056-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68906) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}}
[ 2466.773738] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-73f651c3-b6bd-4cf8-8f11-058e33c339e7 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2466.781568] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-12f09c5a-c2e5-45cc-abc3-e72b1ac7aa90 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2466.795013] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-427413b8-758c-4244-881d-f397aaaba2c1 tempest-ServerShowV257Test-663276056 tempest-ServerShowV257Test-663276056-project-member] [instance: 456ee279-7956-4307-92a0-723cb528c531] Instance VIF info [] {{(pid=68906) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}}
[ 2466.800554] env[68906]: DEBUG nova.virt.vmwareapi.vm_util [None req-427413b8-758c-4244-881d-f397aaaba2c1 tempest-ServerShowV257Test-663276056 tempest-ServerShowV257Test-663276056-project-member] Creating folder: Project (e314256d7abf47dba6c5ca6e46b97008). Parent ref: group-v694750. {{(pid=68906) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}}
[ 2466.800807] env[68906]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-c5cc9fda-ed11-457d-b493-b5d029badfd7 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2466.810740] env[68906]: INFO nova.virt.vmwareapi.vm_util [None req-427413b8-758c-4244-881d-f397aaaba2c1 tempest-ServerShowV257Test-663276056 tempest-ServerShowV257Test-663276056-project-member] Created folder: Project (e314256d7abf47dba6c5ca6e46b97008) in parent group-v694750.
[ 2466.810938] env[68906]: DEBUG nova.virt.vmwareapi.vm_util [None req-427413b8-758c-4244-881d-f397aaaba2c1 tempest-ServerShowV257Test-663276056 tempest-ServerShowV257Test-663276056-project-member] Creating folder: Instances. Parent ref: group-v694862. {{(pid=68906) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}}
[ 2466.811172] env[68906]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-f7d7bfee-9509-4882-9cdd-34b313a940b7 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2466.819774] env[68906]: INFO nova.virt.vmwareapi.vm_util [None req-427413b8-758c-4244-881d-f397aaaba2c1 tempest-ServerShowV257Test-663276056 tempest-ServerShowV257Test-663276056-project-member] Created folder: Instances in parent group-v694862.
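The topology lines above enumerate CPU factorizations: with 1 vCPU and no flavor or image limits (the 65536 defaults), the only sockets*cores*threads product equal to the vCPU count is 1:1:1. A simplified sketch of that enumeration (illustrative only, not the nova.virt.hardware implementation):

    # Enumerate (sockets, cores, threads) triples whose product equals the
    # vCPU count, within the given limits. For vcpus=1 the only candidate
    # is (1, 1, 1), matching "Got 1 possible topologies" above.
    def possible_topologies(vcpus, max_sockets=65536, max_cores=65536,
                            max_threads=65536):
        return [(s, c, t)
                for s in range(1, min(vcpus, max_sockets) + 1)
                for c in range(1, min(vcpus, max_cores) + 1)
                for t in range(1, min(vcpus, max_threads) + 1)
                if s * c * t == vcpus]

    print(possible_topologies(1))  # [(1, 1, 1)]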
[ 2466.819998] env[68906]: DEBUG oslo.service.loopingcall [None req-427413b8-758c-4244-881d-f397aaaba2c1 tempest-ServerShowV257Test-663276056 tempest-ServerShowV257Test-663276056-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=68906) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}}
[ 2466.820220] env[68906]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 456ee279-7956-4307-92a0-723cb528c531] Creating VM on the ESX host {{(pid=68906) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}}
[ 2466.820412] env[68906]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-8118bec8-b914-4569-b36e-42e6efde9d67 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2466.835601] env[68906]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){
[ 2466.835601] env[68906]: value = "task-3475491"
[ 2466.835601] env[68906]: _type = "Task"
[ 2466.835601] env[68906]: } to complete. {{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 2466.842700] env[68906]: DEBUG oslo_vmware.api [-] Task: {'id': task-3475491, 'name': CreateVM_Task} progress is 0%. {{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 2467.346057] env[68906]: DEBUG oslo_vmware.api [-] Task: {'id': task-3475491, 'name': CreateVM_Task, 'duration_secs': 0.233805} completed successfully. {{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}}
[ 2467.346251] env[68906]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 456ee279-7956-4307-92a0-723cb528c531] Created VM on the ESX host {{(pid=68906) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}}
[ 2467.346736] env[68906]: DEBUG oslo_concurrency.lockutils [None req-427413b8-758c-4244-881d-f397aaaba2c1 tempest-ServerShowV257Test-663276056 tempest-ServerShowV257Test-663276056-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 2467.346938] env[68906]: DEBUG oslo_concurrency.lockutils [None req-427413b8-758c-4244-881d-f397aaaba2c1 tempest-ServerShowV257Test-663276056 tempest-ServerShowV257Test-663276056-project-member] Acquired lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 2467.347365] env[68906]: DEBUG oslo_concurrency.lockutils [None req-427413b8-758c-4244-881d-f397aaaba2c1 tempest-ServerShowV257Test-663276056 tempest-ServerShowV257Test-663276056-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}}
[ 2467.347640] env[68906]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-fe5af8cf-9c7e-42d3-adbc-5420d31a9f32 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2467.351734] env[68906]: DEBUG oslo_vmware.api [None req-427413b8-758c-4244-881d-f397aaaba2c1 tempest-ServerShowV257Test-663276056 tempest-ServerShowV257Test-663276056-project-member] Waiting for the task: (returnval){
[ 2467.351734] env[68906]: value = "session[52a3cb0d-1212-490c-2669-91043b4da4d8]52324454-c9f7-9880-6115-b73cbb527698"
[ 2467.351734] env[68906]: _type = "Task"
[ 2467.351734] env[68906]: } to complete. {{(pid=68906) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 2467.359867] env[68906]: DEBUG oslo_vmware.api [None req-427413b8-758c-4244-881d-f397aaaba2c1 tempest-ServerShowV257Test-663276056 tempest-ServerShowV257Test-663276056-project-member] Task: {'id': session[52a3cb0d-1212-490c-2669-91043b4da4d8]52324454-c9f7-9880-6115-b73cbb527698, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68906) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 2467.861610] env[68906]: DEBUG oslo_concurrency.lockutils [None req-427413b8-758c-4244-881d-f397aaaba2c1 tempest-ServerShowV257Test-663276056 tempest-ServerShowV257Test-663276056-project-member] Releasing lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 2467.861987] env[68906]: DEBUG nova.virt.vmwareapi.vmops [None req-427413b8-758c-4244-881d-f397aaaba2c1 tempest-ServerShowV257Test-663276056 tempest-ServerShowV257Test-663276056-project-member] [instance: 456ee279-7956-4307-92a0-723cb528c531] Processing image b1400c31-d33b-4e13-944f-4c645e62493e {{(pid=68906) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}}
[ 2467.862076] env[68906]: DEBUG oslo_concurrency.lockutils [None req-427413b8-758c-4244-881d-f397aaaba2c1 tempest-ServerShowV257Test-663276056 tempest-ServerShowV257Test-663276056-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/b1400c31-d33b-4e13-944f-4c645e62493e/b1400c31-d33b-4e13-944f-4c645e62493e.vmdk" {{(pid=68906) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 2469.141074] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
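The CreateVM_Task and SearchDatastore_Task exchanges above follow oslo.vmware's invoke-then-poll pattern: invoke_api returns a vSphere task reference, and wait_for_task polls it (the "_poll_task ... progress is N%" lines) until the task succeeds or raises. A minimal sketch of that pattern; the host, credentials, morefs, and config spec below are placeholders, not values from this log:

    from oslo_vmware import api

    # Placeholder session; in Nova the host and credentials come from
    # nova.conf's [vmware] section.
    session = api.VMwareAPISession(
        'vc.example.test', 'user', 'secret',
        api_retry_count=10, task_poll_interval=0.5)

    # Placeholder managed object references and config spec, normally
    # built from datastore/folder lookups and the flavor/image.
    folder_ref = vm_config_spec = resource_pool_ref = None

    # invoke_api returns a Task moref; wait_for_task polls it and returns
    # the completed task info (or raises on error).
    task = session.invoke_api(session.vim, 'CreateVM_Task', folder_ref,
                              config=vm_config_spec, pool=resource_pool_ref)
    task_info = session.wait_for_task(task)
    print(task_info.result)  # moref of the created VM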
[ 2469.141371] env[68906]: DEBUG nova.compute.manager [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=68906) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}}
[ 2473.137641] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 2474.140656] env[68906]: DEBUG oslo_service.periodic_task [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Running periodic task ComputeManager.update_available_resource {{(pid=68906) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 2474.151987] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 2474.152281] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 2474.152459] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 2474.152624] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=68906) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}}
[ 2474.153819] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1dd18473-0dfb-4b72-8a23-96ecf62c1fa7 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2474.162703] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-76a08e4e-ffe8-4b3d-a9d0-294f18b538ba {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2474.176381] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-259a199a-ae75-47f9-b20b-867cb97945cc {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2474.182452] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-98784bdb-287f-4c8f-9333-0ebca7334c5b {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2474.210652] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180956MB free_disk=93GB free_vcpus=48 pci_devices=None {{(pid=68906) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}}
[ 2474.210782] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 2474.210975] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 2474.291448] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance d70b039d-c8ad-4ffd-84f8-08f17cb97578 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 2474.291608] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance ed276c3c-6085-427d-b3b7-86bbb8660dbc actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 2474.291736] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance cd208e67-55a3-4c0b-ad49-abd3a700d5ef actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 2474.291862] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance a4f1c6a3-c189-4e3a-8ac9-ac6ec8b95723 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 2474.291992] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 0d12bd8f-0e92-4066-9ada-6eff7b4c5dbe actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 2474.292128] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Instance 456ee279-7956-4307-92a0-723cb528c531 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68906) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}}
[ 2474.292368] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Total usable vcpus: 48, total allocated vcpus: 6 {{(pid=68906) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}}
[ 2474.292434] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1280MB phys_disk=200GB used_disk=6GB total_vcpus=48 used_vcpus=6 pci_stats=[] {{(pid=68906) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}}
[ 2474.372835] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ba47d139-50ce-499d-ba85-cc115201c2cd {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2474.380137] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b0581040-5157-407d-8452-097f90efbfe1 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2474.410325] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-510f424b-2377-458a-9cc4-76ecbbc5faf1 {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2474.416885] env[68906]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3b1af91b-c527-4845-a34e-78e288b0fe6f {{(pid=68906) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 2474.429218] env[68906]: DEBUG nova.compute.provider_tree [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Inventory has not changed in ProviderTree for provider: 1119f6db-bfd7-4ef3-bdff-5c6974dc249b {{(pid=68906) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 2474.438218] env[68906]: DEBUG nova.scheduler.client.report [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Inventory has not changed for provider 1119f6db-bfd7-4ef3-bdff-5c6974dc249b based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 93, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68906) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 2474.451164] env[68906]: DEBUG nova.compute.resource_tracker [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=68906) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}}
[ 2474.451342] env[68906]: DEBUG oslo_concurrency.lockutils [None req-fd313f3f-1215-4daf-a803-1c6f373905cd None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.240s {{(pid=68906) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
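The final resource view above is consistent with the six per-instance allocations listed just before it, each {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}, plus the 512 MB reserved in the MEMORY_MB inventory. An arithmetic cross-check (not Nova code):

    instances = 6
    used_vcpus = instances * 1           # 6, matching used_vcpus=6
    used_disk_gb = instances * 1         # 6, matching used_disk=6GB
    used_ram_mb = 512 + instances * 128  # 1280, matching used_ram=1280MB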